var/home/core/zuul-output/                      (directory)
var/home/core/zuul-output/logs/                 (directory)
var/home/core/zuul-output/logs/kubelet.log.gz

[Binary tar archive: the remainder of the file is the gzip-compressed contents of kubelet.log and is not recoverable as text.]
8 }^|rabt19:ȫxd˥d6J8?7+\޼9^}:_(Дt0ɛ_ 祯X0Ufѱq|MHySec4wG{;u\>x{=}niL\4~ӠqY,7w#Q6zZݿ &ijkstUW @m5ˢaQFQ̖|4zrWg?9MhmluɺZIs  e0W˷{**B%Uaa.Q?~ӻ)?o?Egiy]䬮 {Sܡ_keMlQueb/g_?o<K;[w 9j2 0 e xOz~VD݀Hր(kh0931^NPrM[WJs7*KF4P/\j6ȍ[16HHb QJ= YBb8gQ6w<{8ƾKgmEx #њ v~l'9dOu;u1pREILÓV'r*" ^R, 9uGHy:nEJН&%3$PT,D0yÉ,uw!Qq/&/{)GQ'&DŽC sRs5?>50rs+.8w_E}F a͵KOϮhΘH]Q8l!)a -`Wm8gވ%Gģcw~meK5df Ė!+#jr,帣[YnouC] Jݵd\s7qqi'wRP0ƠodX g+H¿āƓWt6zr ^#2΅H©I@e$gSG'?LIEx߮oU: Վ-&~mV9`}q%%t@t7ߒ;Cح[送nӭ#bYNZh(yaԩŌ~kb~qm <M2k&yĦشI&*COv[`s.a3Oգh|緋p'[ ]=, 7 |Ikb7 ^^,AKonpW_;w}n"5zpS}uKXp좭zuhλ?鵋#/sH|h͟{sE.#ZyU^:y,\ mA!]!_-?jQM<KELh  %8F1 zBű ?VY' StP<p.Hfwp@PcKW \O@$SBe80(3"`jDzMĴ|骵4tsWGG~1$%PRԓ F! meБP)X [>ħZu ؊խdo=>^Oc ⾻uw>zH2)V?(v%݀ԂWI2#@*պ pQIET˺3œ [Ӎ k'pkM 6]TΘ VsVJqοByN0{-6YJB FYC| g M/^In|_39fq\HԢ& +^\@>SgA#<7Nl])&.]GЌ.Ao7B[yF碮|\q m4ؓj4rgWEb?}Eig~.v:T9"-'SpP#ozz)T0RDVNssFADdFaCdש.ײ)Ф\*M2cPrn-sU-*7 ?J]$RFlTb8lij~jjH<6TYcjBY,K3"4"ܮ4N#p#B~23 f`.pTz~e?5(J18܌]Pwx:^ӷ|.[-e4QEj۴}M[M/t h[#Eێ`9Zy\Ws2+j'3Rn2oz7mB_(PXEV(8u܌<~1֣vh@r}X(YׂؓuuxZ̀Q'Gh (*;8OLl81y!xwr;AaїA2*^ L6ݍvkxru۲Կ G0~8!-%iK:x- h$ I@mm%ix=-"I[<) rQ)O"EtDEScJ xP 50%mΆy9rK>Hy::rB c*0*'S?޶#lcJʦHʆPcOXnhhS%(^*yRRfudx*6Ώ=O1We(97(}$FsEMhj 3cQܱ$I{4xyZHBF.6D1I6K[N,Ip{n-U=$YgT%`il!\zϒ@4Ǩ.rb oYZkNC=+1yrbCRPRCDUʥ%Dʘ`(0Rk\` l 7/Eg uu(ׂ&) Qy"B/(OӘinx"ZA-?t~R'-3p'E (ÍH: !d0 #h9 :XLŃ-,G58 71:y6?#\O#3(V[jJAp*tގzM95yɖOΔz=ʀQs4AH h<ł οh$1)DmUGISC5n^Oavp+;K;Y:].ZHdy@BGP>мcA(~N3Z QAO8mYa@B6#1pڷp'\Imy"6$FN$<%⑀sA2(Op@PcU ɀ9=0"ALjM< PSfEc{FW!v~%ya2sF?mmlɑd'}&EYEH0@GUuVRD[Ɛ-g;\SSXTC_@bU@aPU! x1*ws5gQǕ^]L+O"3#.SkʜDJ^[gQl'?S u>zH ou^؇+4oKN3Kdp(b9IKʄBHh]Ew0WCknrK8Dgt"(7PB - ̣# bR2H$\7؄ A2 c*5(K)µl5ZM@4t|al҈Ng% J9 )r Y,&1pG.PTGh`$x> ͊u;"E Rr 0'T4N+!N\[Ed>*C㞅ׄgº M_`p/H,ygNJ h, N`Ir;t=?ij"H!Dp0} REA5@4#`J Q{bxi{V^ZåaO0\wGv5gs)$5 —"GR9 lh+b%7q}J7%q\=IQIϫAJѕ8by-OͥWG=I(v@*F"rtk#3K{q{4< :]UÀP& YvH~q>_ i?qxv( is82>d%KZSͮLJfTʾ7E''˫ŏrd #v<&7L+?8|[q&.RAˆ-==1"\[y]q7lb~\a "`OӢs3" _3fvTKeLuӐi4fU>0: (,hxq[zr7r0p Y W`{eg$s!"x[*T5 M܋lW #ywvP! 
`@]\m:,k~cln}}W1[S(ƌ$ŠIV}c?;&2K?+à z#(wRokc39q4;N]D%BFMgVzDd$ \NcGG*M +,y QRiCȂ;udHEʅbA\3 N< TPB=xⅼbώZyyR2ӡ֦Cct4;^i~3m!=*DBB lwn)1H-άOko/%,ꗱR;'05(ژ^0פaf(/,3!vNꩄwl\zȣ?ɓ에= L%xg2D',"k/%p8%b"XZYh6zd1<ku;?w/v}ዸ Yz2rK苙d>(˖tsRfUZ14}@uv + E܄l!. rfYi6a6YެXXϜVotyfpՓM9#5ٞ^fXY[;/>8L'gT7UZV%'ӟ|Q~)KHdp~[~03;u !3ctS!Sw<#Y૩ex˖aIkŵnH6npPzq3l{, fxʷ0˧㫋؂ cgmI Mϵ,wJÃ&ޏlɏltd1q1P0vY4a#A29HysQ 9DpOxduSٝ a J}P r 3s6G)\Ʒ-ӯ)z[@H_jvpklgX+,`I(°mi9s h~(PAB< %a@ݺ-\s|WJv@='=͔sd?~]dv·&6Uaz=NJxdo~~#FY]-w;uez<_j*N4Hg?_÷pPǹߗNy ӠT© pCLzw7$gЛՔ߿ts(Ovv~ GŘǜ-C,q;[ƥMFS()Wf[4'59e_X9]<;/<79%eZig+1vhu)H6g}=wݷBg&G`9z0LPo@`@vV8%BnxGN'q27BQՠ/6)ºdճD#\uwkϗ*ܵ}.=k1OwW?rk̟[/얿3"ڥ)VjlD9%zo+XjJVT`Ļ^f}wƼ7lQ]oR"rG_`#gF9^fY"iSu -oK0ly[2cJZ&so!O}1$m[YB #6uA?n@z[&}7Tu%}肶L~.ш30jkȌ&^go%̘wd&X1]мKOr~d_Kh!;>3H m{3}Ռ5du4ymyX ܍kH.W?SQE{wsm|ĹM{摳U'\PۏS\zFEq_k/Q֞TME*k ųxN 72N,BcQ +IHm 1(T(F0#Aru=K#iJJya`"ZA۠FHbci9I1/g{CT&_(0ޕ`6iݫ/72JiMiVxa䅯iFcWWηKڰ*Ӆ\?P2󁪆$PT[ f1pmSÎhcDqzmMrs@5せ< ,;l2*07&Gd&qțAJC}TH#hU JB#Xj!h`Q\19"irrY!W>'1QGg.?,5)2R@[b8CNj5]jePNY``=Ӌ@"pR'g:{QpQ<+. Jp5DO7^Q桇ZA-7O o-N+QbDCRSͣx @Fr&7+V$LT}5XD_bZ]^kzoo/71*:86/"ʹÁzs I V[ *+w5m_Yhеa 8\+ "K$qo+ɒ,ٲd $wWKr8p8#AVښ`.r<_c,PS  F9i1)D0`Sق-2ο $mlţI񘂱i qCC7,ktb '>RVLoNɸk{m=iҕ* MBcA$6tM[!aP43ai/xt6RR;\; 'DJyoS:rB{ӅS " ;f`[O Y3>cylz"P{uY! yПd$ 9?d~:Mj9n %u4qZ0Qy "$yWUv~eo.8@1~:ƢWr#|.+K7,"Tp(濝d^8fsާ^i/2P oz>t(3E"Nw 2xpp"Ө|ॗA E}Z4{U7^5o)= FrŚ5s"++Pyrxy+Z -^铨mỈNR[&z{r3{HI=Gm(gR7mB|jEc*|*è-k GTDJι/ טJFu)&nՆͧڲs4[͍s[;wm;w9XV'Ь~vYh 0SsCnFr7mk-BZ+<=>^y'{3}j(a$KR]i0+A$T-ip/f[`h|7ztodRTIvPǃwOq.]A$h샴IP˚BFcBbkǴ҄'vo< Pyn((oL-f6Cǟ*\fvX co n-|FMaX7ǃk[<%z. 
b۰âB֖r7*"ʱى7p6 P|be=ōѐ@8QB*qBRĘ6~cj}OȽbBͫ LohB+l"Eg͹Su|zSsg̼/z3%j6*O'dV-(]hkbRxC rHQ)< )uM+mD!X͵6Lp48mI_u2E51jXlmĦ mشs ]{P(!z.y0D~=;VG3P&i?->/W21g2񃐎˳ewoQGEI:e~k:{w~ק#epak8k?jLH45^بJsܿu p.NCg <7J4ceكvgCYp"J(]м\ p9#YA"wU絡4v-}U8K9pAjO8A%T&K%uQ& 1 l;|05E|L# <3s *VS|Ԗhbͳ7՝5S>;Y|bZQޫI@~ ё\xϠ<%)!90)K u0(MRqsH2&hCH:^(C|8taA&,oW10i-y BXCX!5Re5rhv>W+B'Åʈ3w :쯨!IHLi=y.8"LrB4J-HN :޲Zn,մ 8Pq-cmVZ90+| M %wQ{AQf!~$rI{(j )ooEQkFu}h;X;[fў˛'r YCZR|ƛgޅO,3jow2SEW[6U螕bY8swا$$ ?#ǹ,a pW.еQ@Jz!YvDX].v7^!A:h(F Sw9/nGWOpUj)S )blÃ\SDe{Cf ~?G]q*GPC.Nj ?+w/Ս磓WlIgrIigNN/u;'5e쌣|T+}uzSvB%eW$NW5V7cffYa^ZfZPAWBvF K[eV'\ꪱ<>muODBr WqJEH?bobRo{imwN] 7ͫo?<ݯo^pD9zWG޾D R`T%_y~٨RmЁMM[46]YY0]Yc^|q5)Nܰ 9jҋE|{wZ 0~*(A p#֘ ˚1T^SC^*eINcD2&WHX.b ~J Yp4/n@ vG6ƾM3I0 x$1+#1ˠ-nu7tkXY=gr WIV9K#w5 Yϙ!sЖ(~(51u ˱+/3{Fz!RҖ:n @sU@l饷eb `ZC {v$l3CӁZZ.IyY ("Lh1h%TIn<36S &}`'.羳um}nE7rs)@;&,ZC÷Hl9TKo, U7G^6)K!/ORyE$hqN_PBaPzTރl]3\x$ o;ZPׇՃ>7oYZ{RV[L[SpJ&fZ>罥BehuȩKٜ+jڭTN%yڬ78 -kAmY>Ckd2?K?Ep,#Kk%7FPeZjl1ઊ)^![+ٱ62̎r%~ ]|oʶV8ac=s9on'PbLޓU)Cق`:&YtMJi*(%FdKQt)]屍γlŎ^}}(`x5 rˉ_?Pn GP$>\{Y^A DF\.gU)@"A&ր]Z-IX.7 L@(8h$(j|L2k< K+c}A2f %Di&)ZGjpLHpЄpZJZ+i۔-Ŵw!4jZ+wy>2;jot]ƘZ$渇]a@J%"r48Ѣ4x_&D$VKQc`p݇2" Ȗ<<ΡMi GcBMQy "$yh ,b>/.pFUg9⡂53-t9ʊ!* vnO4o8*1#YBO Y b`soA_eF{~gW"EPl3ÞIyƒ E ,T:/%}1sL(QmGs+cjʰ, -J))$}-\hapi'ݞn۹llpN5wu ێEl-}oEIN_'u{զl+5g._4m#/>nƓϯ0F "9\>LK O?nu;9py,Ų%rzfѺ}`I(ڰԶmm: ?54jw- 5 QY( ½mu^4D^廹{K9Au5\6ȃfu49w/uoi?4hJG<4uxvbwV3ՋDTxS$۝34yR8ie`5TThNYAoҬ^(IrYzE{'6i33tk:_rNm:v7\ݛIӨ7j\L^nU9G=.thf= < 9>y[們W|~=Lgƛ}bY-Yv={h\wuJV9Q6ogNOλڞ%YmwBjE69IĿ颗|&f /gɻ0~^$553c_=g;_G?A#KQe菗*@QW Y^b<"9Q=_mʚ#X 8 $(˧/yO0eI=ufv,5'?'m23f[z'Nǧ)l2t ǥ.Ƣ/3f'9f@zC#q Ag'&Uh57:17Vc1=8}$;I:ouн?!G6C-w>ǚa>/:fm>CPZ >wƚ{Nrso{qi8s]ϭo39a9mvǛ'/\7GCQ괞KwRd>Ecybdy(~#*' LJGNɧ_>s~lSuQVm=ɞ~Ԗy >Jvȉ4H`66ܲGE2C} 4ecĤPUi &tXNL~ '&JR-WO~RY%aR/1",DDf3hA #(H8Hg;e$uOZN/'O?ЪԳ6r23+'N Z_ߜ_v~.\u$DV93(* %B# 9`R1HdDЈ H!!H% M3Y(g\}9'1;r^j:|Y:DP^#KXDHicm 9tA?:ea! d&"9EkY>@O!pUG-GhD/(O=`CZ٠?I >J> g]#Jق<*ό! "I,Q d,g2HyCYD+|8!ВYg-Ga*hGe_PRy^#x,آ9HGrt! 
4Vgnw3GE;Lҝ#5H!󄄒 =DqDMm7/T\Y Wq~=F4,b1Vm4}QR/fR.0q⎴55GM9maY45 + vcG@X T\ìk#G:Rj+fcE&zG+4rA"FQk i|Ubceȑq(q{;{k@[pY÷=`hЮᡧƶ@HbByB TPB T7 ƈRXV)ZJRj)ZJRHDmOҷ֘%ax#Paк5'._4#/>nƓϯ0'  k(en?flr#zzfQYz:G&mm: ?5๸j`w-6 u.$ccd#O mYxlyAa EJzb*w] >4_#fϷsFd4mt`M Y&OƏ6'.'xSW:ř-KRT~.KRT~.KYr_^eq_,U.FmJ >A#;m./5(M,#/g&1c>|>TR3~0fc+33`:TFOF58XMdx?K#e^B=}f#]i0.F5љZ=ؽf@gA=uf/5ٕ+^ͷ)Q3yѐ4Ѷ9 mӫн&?INucwӘL?vہ{Mz-$<نӶmwyR֎6y{M~֕瓛}|:;ghN vF^~M%d~ _BeD__#>]lo%JJb/sc% &]ye+.Q^!$Sy';<3>w(.KeSq'ٕ8da \cI B/]B3;G 8b>RD+μWQi9]ʧB7OMYf~TgQ쮺g5Qd 72N,BcQ +IH2FFdB1"I%d(-̳2?ڱ cYJvW+|xȺ6JFq1мvsk(Q*Ӱ[sŰj#ZMvD%$ w{V$7Q'EB&X,sc"(ѤUzKC}TH#*gEC%Dha!DHD"#(FT8G"W hA*!%AFrm:Y-W_I̎,^/s 5N Kkd H)m -C!'.2(YG,,a=SD <~-U7^84((p%IY'cԱy(P+r'?i!ݗxD<<*ό! "I,Q d,g2HyC NHpyY#Q `Y`WpT) 0Vsl+X鐙C~AЇi2^̑#UH>l'|]VKw;Dm>OH(̷QrFHNWjJʙ/E WU_b* 6JU!^47B1)M&,m<*ݖyamg{ JFIJb!c >rLqy^5*Cz لj;^sXe3Cl7W5y%Mzgpz0O;ԽxFMSmގVLNd=ǂ"%ef[V/ %D(J!%ZYJL%0" ] \aNRc>7O>M,d =!G Қ)U8%b"a1g;Ja8~ckq+L"{1 kI>} K铙|Z_'ݜv3 o~݌ƟF;dimdv 7†ylڍtaT `J 9ӈ:* fZyaq)pgI{8+}Id"kgc%~A|@S"ʶ6(CRP"!`ɚiv]DJɞ;ak&?~aW-wYݽO'RH+}[Nu]ٟ9A(.2L8!NЁ8Al3E:"#5 #< :ёK(܀eJi%d F!^Ţ,ۧ5tΆΟxW-5(e97(UPOMP"g1Kx8rA728$Å ِTxG$WGǤ/,Et) GTqXGMy_Yˁ{C$( :{קp3f݄V(`т7 Z)x^ <2˂>@{/nFRM0RPLrTQ>\@4#`JiQ{bxi}Ze#oOn`?QTDǒ`sh^l! \ H{"ga?Ia@dzw%v( ߝ]"g9)#U"p/{<]Y"gNf0w1V8"a:>KŊ0uǦ(¥Sa^[  ڊN[lB͂i 2pui0&1ΰؤB笌7ٕ)EI%lW˫; 9Z2{5nA`3ӊꪀv`kBȍ)U fwUóOfuUwޕ7~gKuD ̥2,շ(Br͹P; n]+]-CڗѬuyRlh\(`bAtqr0]pj>tUVٸ['`RNWM*&l0ngnpu?][y1cJPNas0W?@y vmcFf y4H)ʝ`S<_ _Ss>E=/@D\H6`F"/"|hpaVzDd`/I蝺i$Wշ(d:2R$EʅbA\3 N< TPBU@Tg4L =~QGG-<,( ؇C- "ݕ9Lڞ c̱|:vQ&I{ɱ6$Qs-9˝RNM}J l(:G'Yz*;goAdmS#:v`;jR^Ԛd95z'v*-GfO)p쑄[pO Y-wpT|}Tw!>wx /r< `:v|W^ Loov.`'6z;(圕%dQ,ād+Q)33^UG's}m7b I -deEwS!y n#$'Z>IT~ujίYyQ쪼 ߙ^OFE%O`͇Ycs]Eeel]uArh 1<5SZU_DAK98g}u)+nq qցc`bҹVƉuls{hCk%:}CLLryå 3jg>OLj9+j$ڶZ)oruҝ즸o#XK8-`7h7!}5!~Uʍ4`boc,/DP کsƑͭ4Z牍-O^WKl3&*. 
g8ph7frS^H99Mt#ia@Yofm-{yJqhye&#QHYu2LwZ D佖EMFS-VA8yeV' IMR @Q|0arroDuAǭjlMyZ)gmw۠fnL TqwW[(֢+brK%bdF/cL.מrkCAmzvZrW7}QMJˑ_ p8<^2*Yj6 Qk %I.P!`V0/2.DB,83ȅ4*L1S |@N@kCPF=r`iLvo籞ǞiJQzHLdu4&SI!RA<ֲisZi=vȴs#4 ڨoy>UjVm}]rMy~= LekV/3;3Jq9eC|u(;sMí✜oXC}6PkEl9C*#sGj_/ų(٧o+eOp|vb./ x2= H?p xvH~xڢ5֨ >Xg!Zjڌ[zu1H_D&.CqⓋsadY7=%0 c1["1NMo#&D]2*׏7We&OJM !MaH`}OS|v=hq9w!C 㘒1@!k)DŽ:GRZ{m?|,ϟde,Fej.tUr=cU:3fZY] ;[/aj&wVim DEnaJְw%7ixT,X?p2wTyp 0w$eMyEG EέRȰ. -J))$fg_wqHhOׯ6e|z<܄0*"df%zf 'ƤB`i4E{wu.7f}lm%v̀?阸Dۙӱ,tc:rtZSv\)~3yR85BEc+S_h9>y9r/q_Q ?k7}7/r<[ʤ3m]3u7.@|KWfF^80 [|4y5È͂,Zy/b-n]{3"ynjz+k~ߨ'_37r< XqXSu<~3twubmӽ?o^|9;#+TgA֔p.,0; \n}ﮫ)ϥ.Fz Ґr%\\= `X\=\LT\=ZŸD\=JJ\ak^=f*y@ fLe(С{qPr$zqaT+XPz0*ƺJr*A)z%+d$`kW*+V*A9dދ#'Q p"Z+)W/P\qgp3pe#B?w'sAMA6#DP کsSb4Z牍 W<Ω`$"&%ɡU3FL:- Hl,ǎ#2AmFhBx?ތ n6P9 ^,^GoQ- +v%<1`OHT#WQ/\ cyKy$$Zr$M}cGu*Mf|p[EKapkXt 5"fH X:6HF΂E#;^sXe[Xǫ`Se ?wUmFPs+ՋZSaN'җ>9YW9Yv9 :'D 3-Cq7ꃎ\q ޭc\gBu'![n58lXLjAZ p}N6\(wp)BYTlʵ6_O`G*VgQưegx &?^4(\_O{ȎFsO(ÈJE2>9-h+D<P 5PDpJ[rdrvC4^"Y7vn s&KEd@;bՄ}{ɸ] UNwMJR !R8e+D*$X1 G+J*TM.$k]诽:̖^|{|`Ҿ Fif]~4Ofe gy_Ry .ǁ3r<'$&D]{}EKB*PFaJpϣB0Hdi/Q"Bxɣ =p"M PA85I*ʃ2\p^CQRţY>ǀtsAZAZA!v;(:l\s3k̽B4BH`B}rKDi̮>]V%7o;v!c9&k./fY{쬯nƣok?x1 P11438(2DQ1xΒj=7Ylg&=ڿ.t-#Z8bwvr 1~3%E).7jo.\Yݬ10J=;e~FC0iQ/jL MZW+kY?Ik];7N۽Z9$יZBuft:uA'g& \:dj%97bۥ͛x#%SZs}sd_,㥾?ځ@7mJ/[3ۘsQc$Y(f8>'H~P <0j|ܳsmWݶfEb救a<moquwH֜*s>E{As8˕)»Xs'_k>V7Ƭ{W1vy>1gVϊ?@D*u3ڕf}[0w`;s'AD*ιYR#Ql b;6댴|RY˵pAb"&ygLD_\+o>۱o ]\~-Kupq̵l( D cr-dw; kr#Λd3!޴JYRY);ř%r8PdH54,EXfF" {JsJK"'q9{'.*/na\ mr)nI=@2&{SQZ!#m@{-?ߨا4F4IBAJ(^)0KsELA n8 OrH: !d2DQIJ,>3q:cîት<ҔӼgB]ԩ.-){ [GLA \ۈpGFg5Q $\%Քx/(U4.1)h#ݎc+os<&:cOv >9@#$z8>!ILd j7k|FAxt 2s C1:[tN`v^Te}O'Iz\1R+9gJiꢜ"ycp 9e%|Xx8PR1M EPۨU+mTb0jx" IPN$ j^pdrvk^]%J\Y/\q^JMK|w,ه߯ӑEyiV^p]%&JKrJREw9E "soœCًC,$l..#.F6_ZY/f!|2V|j%bu5~{nr Y" b?6ۏ(3ꋆ3g-ȹEJ}}2E{}єTym6j}T~uڭ&pԦELh**%8Fc";hj7W$ZDj QjG43Z⑀sA2(Op/@PcSǃy:29.U逸 wz&OMN=?$%x) F! 
meБP)X  Uh ;IEwxqprVmZwq4D\vn-{ߎ Z(bi9 8@e 0`c:#Yx/qޱgiMܭB(G8¢fE"}4*PAmI8M٠5s3bB\h 1RHp;m<8v(Cd(B!9{K>ufY曏.ߵ|,*(|_b$\?E#1٨#QP^Hi`$miҨ $x^*?Ƹ,˛eLȔ=w) ɔ}"q^ޟIe8Lny0޿g_鲸R$D`yF@yކ9ù'T9K%r.ne.EIv)nq`o-IqzB'Q:6A%;.H'K}|L>ru~C^%~z+V޿7YǪ85 4%\} 7=(|qz EHWjDѽޝ࿵+?>N/|\Uɕ9\oa/f+ʮ8цNp8i4w|۰h*0 q8(\Y߳ݟny8Mh]~CݵYϚ/:Qq|co'{^vX7Y>pRM*'o6Y8+d?>o?~O?g>4F<<!A[_[nsíֵT9ۇa[sŹTNz~#GS5 C/,ނ#]ςOѶ Rt , 򳘳JB|}r޻Qx^w[1H2UR@'͛`pc ` CY[uHĈ@+Ҟ`'WF7E7m%>GEO-$Ȅ_<`^p+!2KE!QޑNxUȗUN; Jicy84'I<3Fɬ`d (}A/[" D xBuruRQ+)xePԺX¡á *PRDJtTI))"%E"RXRDJHIy6&8b"RRDJHI)q%E"Rؿ"RRDJHI))"%E"RRDJHI))"%Eu%E"RRDJC1/MsK44- /K44,MsK44-MsK#'ُ0C\,gZ".(p6lp{m5#ɓq_ՒeY/K%UX5قhR*Pw\DC#Yd8A#Z p'dGV#kdV8*r>ϼwNXࣱ&'  ڨ"٨h%ĘwpOxx UvNa&֯ab3O^3gFC<P!GfOSdRN.Iwrb%WF]U<2'01dZ⥷$tm@u# rI֤ dNj UyeIx!&霙,KZE 71<5yPĺ[oJ,k>yrTXl l !HLy$徏/''InЈ ȋr 8"I= !r ;UjMB'[1\L_{q{C4N(7E-˔w׷T LB`u;/^fjsskr<Sr))fS18-Hgՙ!f.{g+z1 _fv_Ii=Hr&ɧrYl Q!4BO>wEN3g RHȲ5|Pmy9 E!l-Y+& \Ha 0eZS*gW#gaa]uòp2eEk˳=`CͿjw%~ElmєWp<_ĵrt4et1H2&f8jL0Rl=.E38bJ8$#԰oŔwҵt(*6/ieMk(Ջ?@\?=w:ˆ Ha.r ^P:( 6؞9V2  {:+$,K]F2@Ɣ Fa7AMu>fi@Ls5$Y@NސY(U& ٬4QE|@RI1-R1.e7!2"IZ'iRZ e&dq\ӗFEmU*4MyѤDlB,υP Q,Ky_&[i%!BsGS cYW5/|bҵ t^>p+} #X5ߋZK dsgźLQoPY6yzB,;ӿ=ዏcb4P[DCn;1xx?'<[;x1OMEV=>OKͧ?57c{FN`w{̎4ގ&#~AO?؛%5qb]Cn= 3RKmR[Ev{zc;X%ݸXC4)4,oп+mE-!JޖЌ ԃΦ"o.|'9!%BHu),׹2d yf1!s5p4+o "u#sgW%0kӕ;d<{WIosAc:ô=t:Dz~sF!V<ṷ~-v;PN׶GZ $ԥęQ8R*EB:2-MKu2ѻK?29/J??'-ݾh,5nS۷y$r-iAIMykrvd.ޘ$B.l5G7_Frގ6L:_̈́8Ӯ[g3?謬=ٰ[nK˖iLteNn:9Iy*mdQ@31-! t+J)* ^=F'! $sdv s$t\]Y:xex u2񌗟,av)DԋLjm|78ޞJֶ+2z>hS2:-wJZn!iVG.kc%~Er+:z"eG>kК- ,OGWN:Nnx6x`iFԊ*md8&n&&l&w[Mx1ԅJsƺQtkW5[wwTAV=s0sdrKL4;x+ׅM'N:}S(9SL\3QbH&ez&ԢH= M6Vf >JQ gN('Hʓ8\;-(zgO8[/#%rTR:dK5G I3zʕ M!@ 1K!*WZgL"YZΑ* a 2 FPAym^ }HXo!:)y4aXu2rz{Xsڮm5rD(&tI`JGD֖v[:g<+ AF$$?s -d1PL:8fW Ȥ1KSɊ JU3GKn)CFN@)-)-7HB4@4B]PgEe9FΖr$f;A5ZR<^ALs8}N-YhFn?:`<2kpu"PSozR`Aɲ^$eJ"SWRT'nʑ_FXDdV5O[֌r–b@@*[Zb>NpPy0;QEqZ5xDgj{k7#m̴ĵMTHXQPj+*J2hx@bpi'm""H0̱X۵XdWSO·xYE0(QXd |/DS&NMFx52(Pr٫Y7 `Wy|onC>E+:o㧹)[/"Rz䔭qS^FIzO+e eu)[zĒf.a~|/]F6dd.DA7t[opV&p᳏4MjR'd͈ \uuσͧVg'd9a xx=G'w? 
#eQRa2|l1z E7z;69ѼQYǃeO6jɂ$ zX-crprLGyL7Vk.zs^𧋔2e2MVJHb9iGhm.H)sDbM\5kzƛ2'01X{/EVh %&z7u7^JM?F.J# GNnp].? _w9Oz rHQ!d=Oh^a= k v@DQdųt_b7M)ъZ8# fs]\kSyuw{t3RW`mF]r>uzBzJB%QW\!E]jɣB|yN]uF*{N |U!]j9uuU43ՕFgT^=YnC _~?~lsQ>? W>'j#ߦ< Ӝl0) H^o,%W??^|%u-#yQ2jDfZl59jo5/{6*$JZ'k济y=Ѥp_fq[(QXܴ4ƣs`r6 F_Vv#4B{Vm+vI϶]Z-m'H@G/ԨzЗ:m̼|XWWs^G^PJbܬO/֒^2,n:N"cj 撌rz-TJ޻}˝α QsA4p4qcKўYՑj)-h/5`wEvhb zAyoi>-aDy%,wVz0 ;ӱL6v;mS22q43rvQm.Uz˄.s=9W;k[͞#bjdW 6O̵5vxyOfr`W7ϟW׳۷Gt&;6]=~+6 qe+9vvz|xsg++CӾnXw܅6}u/wmMZ{-axҮSh7ow9C_s4=IъG6ΊX+Eq׹KS.#m:{l })/ °ϳZٰكCJzz zw6?Ȇ/wFmI)U(eU Agś///m> 63i7{r+ 7} "z(j 1 [ ,Qk$gJ:\εg +7Y @L~_&hqynn kvri~-5PR6 =U(<%ghB g3kU:+NG>kcb=#-;V&ZH8}4)HFKQz F7c0 /ȴpz(_ u9OI^fy"h^I^Pvǫ" AЛZL0~R'-k@rʣL& c" $%*Œ@A͙1qf5EQǗd,VVߓLp5mRF"np)y`"5jmEZIj( ?V1bs[vs u\W5mHd_SO͙{ij4C$)dAm ߿(HDٳGIXcYg(d_ЅCAЛIV GᒺYl 3FVK[3FC",\͹ RR~Lji)ROSo1D(]핐"9e} n9 \{<{>7i{D+G EDjι Mcȥ}kRKr]{ i}&0`5L;fɴErY(כ,˼TKLo7XCx) TIUvY wm~n54t2ŌLvu\>0ݷody+'|!lI_+A{a ({CeB !a}t"BE?:G+!8V[D-A HbLB_S p}*ZNKx!)H/B#S)M.&z⬐EEGx)G $C SѠ? k"K I}{,€GV&^^[oktfu>fhlqhQg򿅜 p"j@]Zs1z'8A Auj!0WUyU&Rn,'pв"Q=ZT&K%uQ& 1 Q(4 (c$rq{c89"0i5dhzQBK\s_e%ݠ}TB@z ё\YkI%: &9 $GAk`ϡI7_uÛ Zk\M\)m ˩ 㑌2X)@7{~=B=Q.v!E I:nFEbNSDTI}4JP@$^'Er; =?jH0TB˘@I >\֒ld.H]^P'u|,#+#YQ߳D3f?7a[fџ9&rj5\Ck@YR%̓TޅOyQfɕ w9 33àJcޏRlZs'+{CQtU0J('䊜f7ޞL(/O`hſFpi+ w@͢iGr!xyY#M3K4vb|9ՕkDK[M}tzyu/Sja(Io}ӃYí7+.Pٝ)Ѝ{O4_]͖W_ۄ~]fz1êqܳi ̥}0J av%P6f|p%j0Nv'e[$;]݆Ɗ۬ʯȆFFF!3& DO9L %e(Vqiq -d) Ftx|s2}_5 ,-}w!-̀1fn#ݷ%/ufT=N$3Bi1h"`/Ixq I H2hv㫿!Lv!GTCHUdt/xŵבKTSё$< ǝu<9E<&Vl`dI/;"ˉ,r?*ruT Qh4(gG=2) اCmLhze:תS|P4>eMϫ:)5W~4_c\9{:8A|]jG.Q+aG* #2(PR7 J*5G$UŇ:]b, RTpo-Ot>g}Z?דKWe-?4w>u; j|n3AoX⻏?u}]T+v1qN: _/G& &ؿ?^ٟ^" ϩ˕Wˇ9ClK*Ĕ#"aq>D; }AZE<#mc"{&CDTW)RRDpCLDK9 Q Ƒ|HCcmT ڜ_C+<4uWŒGsx&lѮXv~\0=_5'ze-+$zU41kS,߾t! 
Zu+!:Fʙۛ)#(6ٰݑuKՆ C Prޢ$nGUT}RqXt߈IS2`K.-XWU&nξB\;C:$*8ʝ7\ 3f4j|HbaJNqe{u]o9Lw]gt*iu[QNd]}YU_[~-]|aZA1hJ#/G*j'6r/9Q:ْK|Nob%W$i"kjK ^Ȍ @cL[#gflKi& ??&Z\߬6>]zv_h6U4203.$i* %^F^ґ0хTm1 gd=6$,r!c!N p4H&$X֐ c}9:4 [Z2eɘV!!3uS/IN Td \riC,gI˒%퐒vv[Bp[jNϯ7ިg- rl4Gdnp.`9VwBD;k(򥖜N)bxЄ?^Gm l{th&RlVLS=a2r^{9c/g,{9c/g7 9{9c/g匽.3rW{9^{9c/g匽3r^{9c/g匽3r2r^{9c/gO ft긥"ya GX(?6tbI@XN1l!(bV0mՋ9tGg \|}w*.$c%oA5' 'y2-M #'FMX_LPQw:~Cc0n/(SQJi%꼗 tԴIIſSu-ʸ2Q0%sH[ OťAuDhk|cVtQ IUp;oB!cF@1gz4‚SmϨm-ǽmè˹g}>kL˹7663nass.hBŠ*dJKKu9ɞW50rOF 93]žkX,!kpoR-0͎աL<*=έp=(:o;cƌL"^ˈ&Z ihi5Y %lܶ@[idHjNcB <* s~ :& T)JTېܭYDuZ/+;X;"mgF=ό̃eX7 k I"nrچ)Z+e:( j/!ؚ"ySv7 p#]ZJH1I)N9ťoW12#$3amk٬[RN#<돭>}5yj1oIЍ_7(M G|xhd@a g'\BITH'K #a:0- 9ڈc(2s#^E.qVq Ӊ r˚d96FrsL(QmGs+cƒdXDcZ)"V =fx^eX|1 ǐ0#-FE;0&iFE=[4\_5=Ӕˤy{rvˡҝr¦=ڗ+{1>}>j(Ձ3xo7CpV)XҀ!.0ќf-N3vg~A%+D"xjHr-h朖KuK"ִq#S +;״_v=39lucﺽ)O6컛P. 6̩5|d.&l Kh]?.Owntfk>=Uв=}jzz6ySagZ>~w>86<])ް{k$ˆiU[kW{esK;EeYnܜ cI> Ȣ3bW~l׿NݭҕD Zl^ ]OQZlj^2 Y._B.+;K ̂xYg ]#WG,/,Ӆ,Ko9m: btq{sZk:]qK$<ْW{RȦm7r 6ɴF1m;Q1>-Fv0m$gz8?i5 x:>-ڹp9oك" %H~]ig[lo :aX 8|Z (CMv3ӸKb|oxY`L݇0(S\#51븦nzqLAeG fF ܦD:)ٹM@wnS6A '{(d_usF]%r9>uURޢ=M0s?}E1:|tsשθ՞6םhX{Δ4= N7HQŚl9K5cǬw* Lע KÓA@j*A[/ߋ C [{O.3+ ?r-M~tw^Tc&^fk "ϡj{k:-2 | |a ,~oy lcS4^#+e@\`c2|. AR 0|F퍋I0A(LIRlL^S0LY!A ͊uP@#a*|gT.aQr5Z}K? /!ߧmQ]Xx6Jkle% pb#E LHI-%Z|yjKl7]@=fWv#޴՛*>2B ?J,լKy8IwZN[x5*9N/+c _7^D-s-%;JḌ)WU?yrOZ4&ͩ[_\ur9'v @:p LFAYY^Ra"a/=M*%Stĉz$ع8հRYI-$kfo bEmB]C TXXYDSHÞx8X kE\ulI:;i{x[3K`d(EWR)TDg4N)DBlN̄H3%Jl`g]XaR)r(YvHKJ):t~Gpͤw'z1M%LΡxC%Rmͪ; Ɵo\cG4*xUZXTgMA.z3ʖ g`;x/B  JBIC`2{ ?轟+.%x춦6ljɡkOLzK7y>ΚJT&I+cP4fpn5C+OǚT󱞡U1J?LFkj|aXBAǢtp]0cWhM몢)smj6 &rr13kv+gD~iK"ޏ>0k>Eg,+^]^q`.Á=n~\UV':qUV'f4 cag NJkŕc+-q.K5CQy9uu{KF\K݉.{}8:m 7b&&m0 B p2t*c+&K}^W򕾫|*_*_+32ɝDSkS)4'tƛAyaN|}+ʌ]wZ]w[Ǎ6D0#\GR(lxpH*mXQ aV? ~2ύ/rM\GW}و$Vn02DŔPq097"nWvTX! W". 
Q9pGd)l_/׻obwU1tXQC}UpUz8_` 9qE|Ըa .{~mV[tccVD‘R|:x Hg_}us+0bm@CBcSs̱7B$P} Ԫ2F|0׷@Ɓ_?gסRY5K ?cwo:ePZx2#C0@Ř)~Vʏ3mwF3lp{ gö$ØkECr*AǜqØBѬoMJIg-x@2Ĵ' ,|< J&-̰iRqe'_P.~J+j3E\ehO;@IJWF;qu>([61\[ғύP.zHz~l]0/,v ZBnPӮwj_:t?=w;H̻ u*XT!в8P3XZOP*u(!TWPe+@ v̩iIJ-[! 8gB/466]يsUihZep9ku5qc"qf5*+p[UVSWJ;quVj7B煣Sxq͹'^On0\h9 )[n}+c LɅS,̖Kk|]7-ooYb ?|&2 fH\]OT5@Ha"/ƺ8IJCFf20f!__~yfIV);aꅩ sɂ](<׉u@`oeZ.m 9lm7 2%rч~~azbWLZSnU8x~QzuwM4 !U9 Su.Ժ`kʯPbޯ-FK{\wE\ KP$D3UTշӟqܩh=[@ }~mі R`T3wr~Ң-JM\컁|n^(K|^֘V'+mq.u4% ޢagY{,H8&䖲(e2HZ٦2r)jy_a SSV-el(?"{ Vi'ݪ\5';sDw/gii_Yػd>]EB KpTfM?aj\jj]1gJJ3Xm"x&""ӻ&/ˇ i* Gs:ytdLI\Z@F  O*g=oDMiU 3x(Qv9G_0mbNҪCuclBFMt6sʂ J?i1̒`:>GRy)FR/NP2*(G  Эw}~'"~Mqi*Dwѱց1|,Ɗ{nFHfbY]_ )E*R iokIpszO"۝|--mhf#>.z13G,D:H1&2RV UDYRM~ΰTt-(_s*}PRK)]N8\dҘyd>}Ymt}*(u|]h_f\oxS Kϼb3*"fh~ѪԠzeƝP0{:sм+W_+xק˱{x?y$%p % j}.!Mpj6Ļު4%/q$ e:[n abVOS8+&X]X):I iƌŁ!poHF v.q#֍1z2,moHE#Y+更VD =+jNh"%Z4tD`,[h9YF Jnc0+=bl:_gy䕿^':bێEyuqAƄG\]@ȧUPZR R(,|8$a8R<1}-ISCfܽZ$9[*abb^%K\RRx̫I2/SBT$% jMm$;[۾AceA89kZ$PIȒbeKxʲZg =>_juT[4e1V2 I""`Tg)x0NJ\j<( :1vJ(s: c Wq -טHT`E_5,!d3H PN6 du L;:&PD7% T@2*PLL[Vwj i_ZP"ʦVv_D3s335??g*͒0)N[.& N,i  Cly pٛaUF09HM纪H9IocP<iL|5P85?s3+;&zU̓(w{|ic\?Q0Җ_J*4csA9]<+{6Fi}NyOZ%tq0 UkxaW,=iqyYITփ-ZYcWXy|qn8 aEquX+~獌}{kH)C+܊YJ˭ؑEr+{tAL/lP BL;yƄ I"Ͽ" ;zxwrOdۍl.QfE~ Qyנ=Оo}>s ֵb"TV9.&IO~{鳽KʘcV̲2/זZDBSQI5@}K<ϴﳯh@c t)}z4tN`v=|QEC NF7>`e,4#81#9zxN@+^- 5vná9dG5FnSe0=ʻV9U.6]6ݑu""!Zk6,c"QgNǵWw)!ܭiE T^McT'Y$up(jKo]5Hݛ`jB ٣7m6ӯnBeZZp-"39&fGԙ^g4WoI4\$0_F߳_~7:{>LY `xgfZ0,N/"}N]j0 $f9gU$pۋi38r@iy2O,m^x3V4Mn~ ASuMJW"OQ:s*EI|ο{e|J4}H9/3.4ϿK"$2aȅsrmē1}tGRIQos!r3}6pta΂ŷfy(M 4e q還4*b/rc+˥xu8;9嚺ؙ,RĽW-yawa$%:+ƺк l]9!%5 {\7߷rl"4_>Tf+Bi&>O\4+` 8pZi0`Y.0!)P$/r} EyiDM889o6G/F.}s]yt~`6 '@ES,%6uln}N[anN3[Ǫ׬~BPvMt@_\ iw`I=m<-E )"XSA(%)ФZaR1ֻ`>sݿPfܽʯ97Y"vvC#2B+F;#"7T :ulICZ1m/;ҭ?zpn_r &rh E&!$P- +&UTa{[0w2K5ʖze3asspBPA>zΊ? 
$L#wx31IщgC8!LʼnWQ^iJ ZZ4atunrwq~wI UMK-w>H6=y.h,쐨=w(hv.aBn-tU1]HDPd#ymaR!q`1OqPn1eH-:둨dZ bA(w{c\3KC!Vr]\X~t|o "c\cC, J'cx i^xA=Dc43I9J%$m30T[0qY11Bt/&m"X9[R䷫ڥLVN൘QZrep˱bSFb&$ǩD*,Ec|O_'i/Pʖ1Rn6umUQ\Lb" 0㋰q-A6 a9~ړ :0 A[xSpm>$ 8X0DxeB# Mk#*ec 4nHþ*D}ho^|>0؇gwa?wj b8*vͪ,NJ tO%Ϲc[h?waVWcڟ7ct3tי3,O*sTcYkBkyU(A8Cʿ`g]Xo"}Y1yK[iq̅bf3D6EJQ`;w\ߜ VP}KTg2>ku.|BO}5-uex*Qeǰ7c(Vaqլ|J\)ժ9DyV&$ 04wa@TYaQFlTB*6Mݡ.؜1ChyEi1[DFKC$OS냛P|Q xSʾg9H%}V6wO}B? `QDN}咕h[-Qқy"n%x.IF~ѹ4\Q cLk&??WKvp#={nu-B<,; b&[r{ʘ0AMݞUz5hA RaδjR>M <È*i2 #rILIB6v1쯄p{v~w{{ѢNc_v ډ>^XB8uk[?&sU$ n@ APŪ!CJ4:|! b;E$IOyQ^%V[|TdMPdEocC0,"Xy!uM'*2gYR{S{?|:]=LÇPd} @qo_=t '|~[| j0yK_dg:yZvڄ/32bjli| Nț)CfXT ~UDe\]ₜ1L? 8&*V2Lh'ˈ:Aq 'ʽ4{25Z{[ZLbѷbn 5n>j8K3qY2'sC5Mh$1XI]mZ[.2kT/ƋǏMtŲ}nTu|wai0R8?pxO~^1T&^qJ nyBRszf.fh4 |γK_hx)Ҏga_'!Cp\ZG+Z^ͤ28i}ޘ%$g܆MuUbyprptzɱL>_\gDsNC$$D~@.Y.އ*1ո5T(+dZpQ0%.51[bWʼnߍq5g]QA>e}Z}^(ߺkDHYЉ)WBTG)Z*BU[+9UYy_8PNaL-bVzo\̪<~ BD 5@ဨa aیTmjD9EB0P"v=^ilIus>ey.i?&'czfSV)WH<\d g\J%ϦeP&܍kB0TG5J+-[3{@+s%DAd%q 1'eyT}ryo2pjvp9 3Sű_ >Ft*o^E2E.J[5tЦe\;Sj<=G287f:U)Sf"c \i`Kd)s',V>v*;9p F%bpfJ {pȴwMKa ;BuR5ix*Yʽ&kƗ({&QTl-u>$KiU6aDEP38L4`WlRPm h)=ᨁ ƨ->QO s SoCXYsR癮Cig3";WݩiOі3¢J.~lx8}̔eRclj˶L>ܞ$ْٹe;pf|Ǿ ;$X=#7 +w53ۤ8Eef~u⸮,)*"oᙏ|txóvt \ƍNQ.l)xM"JKc n".0ܿ`)9"Eڂl>NѿNS`r8IB:|i x i{3;x:n\ .\"FS\|4t[xw  ˏ+:N@n?C; U^w(ZOw)q,Ͱ$cY^jre9X]sG zJpSR2|/[zA/{'9 @*{toͤIr^WY܃%2RqJ+hA-g<[B3hx7-˙=ӡ;>R҇vr!g6׆ڐvmH*Y29U!Д\D綤;k wS0 =( =|IfkX$YYǔ-Θ:"Sx˳-g,$yBv9h~OKWuwtZx]1yEZuDvh}]&>jfTH1(F>kڌyY|°kԷtvaܥKbš&t0i;$|508IMC0)N//G%4=I9ź A9cnA4/%ѢNYE\SQ_)$YX2o 7es1f)!^I\'(8ͳI`Z` ǀK|Fz"={2Z^ZEEݙ;ug]QN{ieʂIN߸8P8up0b -ECߟ!T[p#_<ڃ=>/?v9qÑPeEf"͏1rr_.)߬k;s1bB;4K_!w X);96;ln Ik,G8X`'5Յ$VO cYD>CVoK c.0-+,q~*JW SUIʼ9͚FQYXy_7 EcaLХ Pg fF rAįtrnrLgA8aN`.Y ]VQP*4%Q hreDKǜâ,A+@,V~-}Ub݆-ܙ_kZ#ݪ W9|nfǠʂ%3py`CmI%<\I)`}rFV'/x4hZ)|C ttF+BkB"LSfcʳg*ħ6#P.?{Se_6~Y0ۜWB03-fr|: T? 
,ѱ!^j.qt:pCS ;ͩsZoM '2p)z[[.#_ rx7,~y@|t'^}ψ|Z`/E}S irQcixO"SZXxѰ'fZ{_n#/Nn~xsV'漝 wJ;x FriƱַ%ˬx;y14kpwrRh\Ёk$U9yu6b_5(!>tR%B)G jh8b_9^,=FT~-nn1"A1K?Sk+g_d] ٜ>ESBad3m{hSuƼC-ag~XVpV~^.r9+%eJ&PD^%DG jr>ݪ!/c?GS5I72<^yMhɞp,eC [>'#2&#%̽RR\ڡr4Lv]ciϙT-R-^|\zŚ/UGIVq.ntޕW5D$uN셽r;D 6rw/ k~/~0M;"BIػ "[_5#JPL/fR[ߖm*<NPQ`JTs}Nef˦EwսtJ5J&ڒ$q0ڜDIPe,Lcypфt) #3="w^xsE,ՆV gdj:۟>]]âr_9 >y4ʿ)?Tëa#ܙPbhe3戯 '~gb}~FƋq^of~Iv&H9Ш|x\CAU} &-gH@(!=_0^vg3#Sh/i pn8MaِXqtx$ZO]Bkf؎YMU޵&\)}k\H!YE+EIDU] c6*=6np7T he39t[`@p!f{un̊9UY3ģM$>E0J"jWDap0L]S~??.>,ۧeƟgp~kũݦ%q*2$s e=8^got2.hwbMꬪհ1La+ĥ2z/Sj&8%.T6q6>7 LNe_r rfZ1iW1 I;tgr㕜c\MعXȤ8x#4Gf 9X86w9h츠Q\XF߯&)J,2ȭ@M}I=簉ys:sBiV)4 /DUe&8~nئњ+h`o׾pf X0 LU4u^7ar\(hCI{RV\nd KCSz/䰑:WH[b"'%%(TNzkC2 l;=+X4BH B1|<.Z&4|roRuW%I7JmZgU=Ilٜ3EY:W ݳ[R휢3Ep^q*Ne1w061;uyGEz%u^h?N~-Z L6iB#m\EAѶhž8hoC-nfߐL^Nj:e-j/-ɞ,cIMVO}{IG#}mqҜ]|Ual:0 w*![3 dt78fjjl純vpUc6x \<:b0j<6 tUO`L,w1-c}љ(~D^HZl;!qb3 )O@vo`)"tSM|wҥo~9I/wK6NIq~=\NGTT\ +bMkHr:0r0'yJ.~^IBqNLw77wq7)w4 Cױd;h0x6(a^Ąh-{?bOO??H)Q)-/4  G>,F|bi]ŷ393`u=d^[z~w|.j{vq~'eMƧdp&z{~|\oۣv wO&ӯ<,ڤTLoC╀\&”ǣws8$ )P`d LZ : vMH:'f@A2!^*枉AI8kPD>PDU&l Kqhum3l,ᢈIFOD"Lsql|EwIJ/3X\)[ R*x]X=ʲӨ-ՖqBbp,'tGL"o&6*xTBJs/T HDMNU4r,AUyQ#%EZ2簡0J-Z]<7~?)tb%7A/jsg21hA Zqa5l4ɯjr3u e"OjyKj/X"# 92>a&r-X+B<a R/̒o)Q٨xœ¹"s³͊ń{P9ĢqV (L!N48aJ$4WA&>WԗdS$ݣ).ͧF\n %mt?z{S1Il:}I HR",* (i(4:0Q3>Xsc2ɑ{MQ-f] S)%%E 8ԮȜ+`ޢRZ7!WyF@v>$@"a/8x~X-U54CV}2,s)tE#x'MFA.i.9U+nS!Z&1>b,c`5b)^UVSs.j۸w,0-W5O<uDg|f;Mq4Nq}ȱ@sDp8LP/iP[9ŷ}_9 /GWӯ+,ie/KhzIJD]֣m`@"EV ptu?gt9όZbx &ȏX٨Mjal3treF9MrGɽɄ2vF)܆^V3j^|,{!i1tc y7C͙&%WN5}*}ys~CoxiRz2gKFlh:U 5S |*mr)tSZLMcmh,oyG@oGQK)qʈCc{$GG-雭Jg|])Fٛ>#9\LLxA+!(ʱd\ue'z\ݰx/rJht5˼J!e4ՐCKG3ክfb*n[8D{gc%2pcxvY@k_ XY6E9<Q84LsoŞ \jۃ7r :'pwl-4Eskzs+;LaLA[Ɵ 9;ƥ@J-i-Uܵ}!Wv- /~YYIR4+eEO"ufk]MQ5|dTVubt"]&)1vz;Nk!׈zdKQbk =OZD6[]Twa+|cd$eMnN`Pb∡6w5GRd}[mm톆)]^HC&W,U8eo0-/+LuϹ~$Ie梟CWVL 2U;3 o}C2JnmUOۇT T{b\]v0.&W; _qT19Jq淦FbR &\k(KzӒ-yJ'`N -[?~:IV8) +M[Tf<:vR)JVPzPooKMRMb]t%:.J+lux6pԴ݄h'e8D9g.!s]5oD& 6E4Egb*#Z,nA܄'NJ0?NϤ50戣\*QԜlj4i՘·W'nN#)usf=UVMD3Fƒ8G64!v.Ԓj|(M[/HKTeݪ 
ՃV c3{*Uה[312>n'E;z0Ȭ¡aMswIܓ؝\owͤ<ŀ4[dywfnf%9ƋѐLk~ӛށH5kNY*e?. ٞKvO *RБ&Zx;P؞!z/MVqB[y ,!T)\OKXp\!nʃDncܐ<4G>s P ~P}~,'.[Y4vYs:R;$@ 1A1I\/܂3?~ 0,7(a^v +O>F?HQ0 ?{ƣz?O&דǣ[eԽX{/O"eן@ڗ}ZrdS=ާ_-,>Lzc؈޶Hu{/{yoqxD &OGK)$Gεi;@!<6&)C)kyFx8+5»')s? Aa"u`T]ok6ŞaXlO;";G}G,A^E7 9GbR\A'}wHxIrEͪ=m!5 &r.7Izr} OB\~:bxɰg h7wtlNr1D5\A. < _^Dg Nc&XQ}sΤ"ā R9SQwkG;οQɨW7-)XVyL@XV_Y*|Љ5~w\îwT>%N~N/,Gel~8i~_ ZCv|uIqFw0Nf6,2'P(K`q !D@}h]UG{B0 ]Dcmqd1'p©ĜE=\N9>-Up0}QQBJR9%10HE pd'j\UY pO0u(Q2͏Q*woZ7OةL֟ zt0f3JӝN(:G:a rU3xNDקjn/RdV$Ϸtp ZH g1"Dۄyb2Q")NT +Ť9 ,ƒ&`s`V$q42 l:*m+nIk3fBrO %(ݔ#+BjHU=&'BvL9M(AanıeTD ^05zّA}lŒiدƩ茴RaS/=f z0҄5[1O`r#,S4 =|VZYբGwn>Ņ&4_J2wH u!B&a/irPY$!18DK4jEi$ {*Ay)'%g1%:E{g)B+of^No"Z`.ߘCi4mriR*Qe>>[ֹ. yL$ "- V&A39X+pX,M01&0b‚H0ס&G=@XLNؠ 4yC41S2ԆkR(p0>/^ K56>A'5V.CnĢб#K588RLK)T4dȐ)2d{eanڛ/IlxESc* 56,Z$q Eg ;/2@1;kr*eʕQΘOqA|jtv*fQ>vgStB |٣h zw/Ƣq''Ir|дTN52}}*7?|)CgVm-&em٬ZTeikR`"HPt!έ@2ۤg7Hл7 j0"2dKɩڮh@76a`3us}ƒY 4( RAbTڌC7k)lgt-P}SYxCxXڀ_<gbS")e1L 0;5иfY<7 A"y) nðʮ;-2mJh *B^ Yy!1X(VMPUuE H>p"~)MvmaV\R)%)Kb: &sZabC 4?F$k/JNyY+2|Ώ =\y&wmZ\?ti] P\7\jð!M_şpQTB_D|(s=PU"@M&ڄ|갷[RݵsQ[Tsï~] E^Cwb.~>{s%׆Eh48`PfAiU )J{16X'Rm!c:\Npbygzq}-fw3U8u_nlf+q _G Ǎg7E#"3sR!n+u"yk\#A[j&=M>lcz[=W#mkq@JhH *E z`kςEVY9;1kjE^z 9Fd"i :/ OIwAc)|,)g?e}XzCO^o3Yc3CHJ/T-2ޏ#,ŵ@Z?+ ǵoԎG 뷢pl9^&A_܄Vɧ{kJP`Qg-|UD/8T{iHXұ53YfU6ƮP5>wUJ"4ý;I΁خ)TG; y;#roWQSi̽.2WKNBo ՀM2/W![}2Jd}x7e6狅]| n<QL% m>1GcN ]~+!^c>>EWXH`||6?c^?W}sC6X}N/raG~d dRbS5maa8i~?Ѿx:(,V.΁܏^tARbmб+ڋwE3`~Xq$_j{W5Tu( Sl@6MhnX;.>e'?ݷ=.B-O'ծH{?x@W2\]VW@9ޙc$]Z' VH ΜSn02DŔeotOEk-k_y?~sW-5}s(PdPSs}{󏳛wr}Wweͭ8V+gZCي6L5kBEa#aDB<% "u0,''4j!My37i)y&UӞևlĀ&Ș7|FNG.K5QpDyqm5qb*aO3\?{ƑB_n~TwW ƻX @8p{1mɖeYcNj߯ġH3İ%pzW" ҇r-Ut5y8%XFaN߽k o1]J#Q{0(J%=Z03>-Ke;xLVR6O9_rc7d ?r) x}'de9xbAq,*c;D[CwA9­7\R֡%a_[Ҟ~ځRy6tR{:H2ាX )ЖRyLԝj@2ˎ//&frKtzד@Ia"IKR}) }Qd^Qi8c;ᴞ1y@tHmZcKAc@>|{A=&$\p)wi,κ. 
JG~^«ec 1 y2q<=n/Z0#,Fځhc;p6v^;QGeϨbJD^W=?ac/(VE!':$hSq_=lH*VD5ȏ/"#K妇x2^ `\\h|iZiN@lVZ]ysۉcHC;Gf5+-%_o1~])ib2[d3F=?==ͯU|g~ㅊ"ں> #IN4/N>VU|Ƙ'A"J2Q }9]H P1=e6of@Ji書9wɛ:|m~^*u9s~9<ރkYJRwpۣea-~k #(&H %i@١޽5o;u9aص o,MjϕVV6qmйEkVcia99gvHk8tϢUhVnV=d%<={8 SCľu_>}wT0Ɠ-ԅe<ǔ>blt^RKF  Q$X9ƍOizR]ۓ*fZǦW tCPbE ?ÝZ:>};(nՀiOS@{C+V]6Ԛu2PHꐠhjYre9-TuT֘qNxyZӏ_Ҵit8g =f`[]>xr|h00^u;'3F #8-"wZx\ml{hj4F*f_0PB]o @Ѿ1ڷNI|HQϠDd OFBrf_F\5c*'w>Vs5.%= ORt3 nniTrjMOG#jG m7h̘u4ѭAخ@BH7hҰ6GSQi{TG͆Ұl8FeT=Y32"k} ($k)D-֥aԫ4l-FNfL}>?:e~<|`KlK\զ:ZReԋsThs)_}59QoK+k{)[~5p~הloс +eS^*{|UV?&O;~zדq|gg|yw[IXUpfjs2y1"eX:FK~~8 OvMЯ~ソwBӡRAo?m{W:_|'*dtJe-[G5n*`)QumlD+\pMBrelG.+N].2lK9³$Ӓa֨}N=CJIKI@Of]6:Jh"-ȗ]ǒ-ptl S^" ss˳qð2֐qkDK 졫fA5D%zzrp{ 37f^1m6&(^uHj19ޖ SB࿐RNqB}v}6s@&}pmb(@QUtV)!ʨ4zϋ3逖l VfO>T$z) RwBe-xQUof/]"#HJ K'Ξ!<XjGV-FolZt6tsun}9x֞Jcr&V2(ɋBb)+҄1'Z`= ++ hNe8@Ť[: Ċs gS3kl+-&ҔL+x $ӞEUٔkg)t@A*?MJicLKh~֮X3Z u6-P296##a!ֱ -s1͙' LAº LӬ ft] kTl%*-v[44^&u7ԪtCJ7ԪtCP[ C'΋?b3>\钕{R|JTI*}rv3qlKΩұ54+|re%}`C}Y/o~(^:2#G1Wc1MlꂌRR&wiٻZB;h>?F®5D,}[8niZS{s ς\ ް=҃]!]#ͨb*Jh-*] uo !g >Ba KbƋ v 2vdV/x&Vuu GeM`#A5$4jjKA7@/-K;V-g YKJuhdK膐ϴlXITcL?hޘVXfOaOI G؟O=fs GAR-a,MO3B'E%A;3fMK3KGhwJZatX 4΢&?̣'{<5!|o='`D2;H.KqI5d.\GVc{^#Q)WgfPl ۿ5!]H|_mpʑ ! 
zq~Wn>\l5ʉ 2>6+a4P hX*\ }# G@ ;D_ᑲz9s-;lxIƵz}˳ѭ6jzB zIgʕW)ԛ)9,!SX)M.xeq(6W >0ʸFm%.l٬K!qlpSJ*Uceͥ]2* h=M[^fe 5ybNEch!xI?܃vKZ~nZcyG{[У,A5场\q]x[1䢱c1h4b]Xr :3qFFO8Y tvUE͛Xhl}-5ZyaiɌnYSRpFnvK71G&ަty_qg&RJk|5*]=" By J>ml)6D[dwߘra0eJSTD2U }ԑBU=O3 J |kHyB,%F 'LvZ`hk}ƴu6P<*ʚm$ kX;f4h␞K.bXv3 l1Ҽ.]m[mEj.Ү iz*M*Mf64!z2Bk^Xd :̔ͼ^]C+|5Z0~4!MES{~KGA3q3me!cPZ%Vސ`RuRI`5'o#O5-LBb3H"$'8|DDmPfǜ1^*T[Qw9NK61jv ۆW)zڍ 4uVvo_u4ނhke LYT Wfy$B$bSI؊!^q:k6hR9<'X}V;4탼y\3fK^ xOVNIFEk%~Wd]T4?E]aJ7% hoT%rJV0y 6(d-KUKI'pt],NeJg&DIJH 7#I/"<""# >,f a hQFlYlf5Uj4Y>$LUWFFF|q މˎ(}=xEya̶{0,&tƻ%441X:Fp%3poГܗxˋ 0q;bLr٣AvZ防-X}@ΑW#=Hi.$YxO[ͿeAe'5IpTzZXŀ;q*pH}uvl});(f> l9 Lw؇AeͽgBsA;=,7N`Вl!ZnxBy"3aE \% ߼w:)qr]=·-N>x҉vhok<9"VqCZ8Ä w&a.1Ƥo>{mP7шc<碾O~aY4,sP$W?V#PV+tq!_=G!m6!י/_/>!.lAJ ss<] ;rjKImHP& !(pحaHȑg(эeQv g)]B6?~8/{@]s.Rt@b'%ff/Ǎ4^ c% TZu람C}:qvD U,ʴ%tYLG tJGFPKOT \KHؔKZ窳oj2gLNuѶ{ǾtΡË %P%s?h:|v,T r{p\fhkon'I23weŹ%bHk ` 9L\ ->?OK<$Ve3D4ޒ]>:^.Mwʛ9oCw^.y>;5,5h&F"3mn;QcDI,<FBAiXu25!o' ҂ Fްz^đqk[e5/K덌|o[ŋ`PYA2>w8,տBCoIWAh@2>v7lozwTǓxZ~rVk9%Jzw{! 2Q@i7/+|V *:? Z[P4uܪ;bWKs>o#^M5[t솢 y;͈V9|Lg_Pzc-`1\\w 8ymVMm.N 8 y<0H''eh6\{} ?>[%| 򥨯'|ۣe{2þ@g4P|R̽Hܩx Pev~ ,&/h`4/a,݅hwv_q䙤0Mz0H0ͪJxZwu>}x`;X`7qAEN Ziwr[{2f>T}rwPhFLK&ώ[}#ANx8[pw]x6"Uuڌ郡AU_q|w_S2ؽ\OǓ^ >zvC5ɳ<Ō,vdž#`}iƧD[c8 v0jLM*6PԊD ڰ@E#Fo>Il YYpE2wď+ЛP(ԧ/'?.eQ6Ւ#@L~>G'wڹ!Gɫsltf4eߘNrzSջT NNRy*cY0y= .f IOCWh;0tZƌJiۚ <^6Xz1΢=c혟h]X{NW!C?9oԹ^mggNWx[Ot3 g?~Zէ .8n__O:ok~~9|!i,D?u >l& _2an#D5̒ rra7a'd|ͦ7AG]!*e\cQTUNZq[& XGP;؈nDCF#?)dI 7U8ē߸i cW:bהE-LQ`bvYSWYD'srJIv$"\Tj k ^cN,(|r|"琀?YZ'4; DCޚM9wBŢl@3r{shBh6w˴Pj$fo +\IVN:C-ggDFަ"Goxe# >pYY'#<5R$oy#n*cugM2\95Y^ r3&Edz7 2~DL) ",55,D0>Z)Sc}Er&yyeCˋOpЪͩBi([pؓF+XL*\LPwHNXA-|N^L(2d}2y#O/>a6Tg#!:"QQ$@bg"6cZbqgUwB!f1 a7 kt#H+Ϩ0W馏Ƙ]kDeIkD6x186GLoGH-sJ]ns~6\(B*u73M@4/kj!y('E6sLr}0YNWx~Z9ӃӮ<4g-8+FH((ΩdL&b 0Sܚ'[ӏ{W UzHnM/~07_rkܚ1,5c6/kdLj|{hHybʼn5CYN![VBQ1FU7V*D9P␢XfL =ʦEk{g#@ 98!њgdYj{ T_yUœK.4F{ItB(|I+4.6&% AjS Jl^2^Q";Ci ! 
q,|vp2Ro^@&5L"ϔ C k5-%=9E(߾i[hYQte2e]'6@ vܐRq:.cDQ֥0<*6UoK̫p^L ޺5MtRFhl8{ۨa/,Xxy@H:N!W"!bqXW?Q>h.2w:-f0+2Aպ{N|IgP6ԠPtNIiSDSop?Wlj>(nᨅf-UUsd]I4Rپ;{yJܼf?,jś_tʸf#G54f,c-tY,O֡+ɺvW:͖#KԪn+ ±YRS~Lj GԻmWoׄ,)[MAgeKnx/-TM3e -eKeľ1xgav9e}Kz^h@{1a&+øqK?[O\~b_׹;_DVϋO~>W^Ƴ"Y):ϛB,4wjg'VzH{ܗ>bWvRʢ&徧D zt ?Q7=//n?,F] n=͗+1ϛ]׃{>ypnv?Re׋P<7ZIӾVLρz.dz{.L LYU G *՚Ui-9͌͛ Ra?OqJW}[cz -ВL]BKiBlF5hO.7ZkZz%hWC<7Gl^oW2ZBKo.4kpvGvMLep9k[f[ zv o1@d{S abٞ?b10ohv aӞ^yd>ld&gIKMSd'B^ = =zHEo&{ 68qУ7%#%z卹Py؏,hGM,@;Y7s,RUG@hҿ@q }:TW*C0Bq^%e#'f֌%.|kZ..?*@m-FMݮmLAO-hfycfͨqrriٻL{}LwuŭrJseq>3Z"TZ~)4Gû%w{eG Z^hƼ4#5Zo \yc26Ѝ% ~|M[--N*t^aAg(lV_=2`+(9;BW {~凂fCxf9O&AůAwf&߰8_qvэٱ#N,Bʙ# ZR6- bK6'L䠲 !&lLȽ)m6"T0txeɂ \cc)j)i { 5&mdSN4lp-Oן 5C^s`ܘ#=۪e(ȑkő{)ۋ 5JԉN>kMb c:^,ÚS 1ᔅKi96X Ai T| KM9'5-9h*{/VBkt AP):E* ` !%#c/G~ ` 1n[l,jV\cЗ ETl]zGmsN|wWmfm.,~1&Sy=!b֬w zs9!m$0vǙd!19nDGgj"# cQ`8ho$^#״=9f5)ŭ%Ȅ,R!'%_ƶG$_n9XⰃ1nbY6: #GޞheX>IKHU ^G%v6-;XJ [.cfNldQM bem 邹spD.G!^QlW>Q,V.[Z8~4}W-q`&!c6׼f_8v>oQ)w?xA'~}|iƧ͂GŽ6;ENr'3,C!3R Q[6cӹU34Js@3!ywP5|gOFoR[VTUNIX%|M;baѴ(kNE' )8!qu-wTlA]ѵ FgT& Ta`Jd!ށuZp+8Ol1.똎.-#gevxn!anc{"5Y~Nǁ?drȎ , b%ϡGY ] /@s0u\G-Ʌ 9iM)XGZrFS"ߖEB֤# 6Šf/g/,}\O';.S6}g>Ќ~͠gy{yٛ2yRQFFx©*גrB9h1sҩĆ[}^P;(%u>nb$끵pHa* FxCXK IpۺoMޯ#W׶7-:KΨ}sTKW_/9tQ]_~9k(Մ{>q¶~}Ps 䯂Zz!I`{rv CٻFn$r۴ bo6L`l[7c%y~RZeAVd=*j#,WlFFqF(3t%EtmExQ Jtɥ7=̀ Oe0W"_ҩMPy$jGrNG&J@3`IB@f6c R+ QRnl9<' E$zL5FaوH}#'JTKC%+YS!$$p`vjt{?iKtATLĺu1Z7Z pj|љ-Le% BPT,زyر (Twp*)(09D!GqAN ~ŽGd^-]A@&X(wn[B@fѬ# y 5f?ɖQv8& }+SE:#X(:csz-Gx|!J^ 2蓭@ò.aE V!F))oh-b 5WkTL/r#4 MWc?+7Q{vc2]1V*1{|iq5 wjD"eŁ-l,+2h{w^>QMO[0 ' ]G> 6WO@5L]]{Wf׼p{4كQv~h"O/׌r5@Q_+̂N4(azfJ$\OyOV6j=y{wB͂Ve$gy>5u[Y௳˭n>Ә#Xt:K?HWS?.z˜N]9 nnrj rΒ"% $^#2o2!X+& y[${ GxV׉rc"^{g\糁I>g ,,>4 m|9.3Uq<a.%UZ5u*W ) #  e?VW~lƜKI ziw#$( :BhzL%^NF1%tr`v Ԗ -Q@BR>j+kB4\ ED~3F6)"tPMD OĀaU@1Hz4r111HЭRCoٶ BGP׳# Dn]LS%4^YcJv@uTILIBġ W@W8j1JĄLL47F4tON0+uo{Wƍ_Fukr:DaJcBҏ&l`0%!+P<*UWI 00b8fuK93NtvD rpYh,Z5s!Jp<5$:H원U Fh21.<h6KL j6O Dhr\DG+q_6tDr>%67\zEpQP=e%[&=iƙV8og˜ kA@ $ͤeOTmyu9H<ހx6.غ&2^JɊ6kq{#wCj[-"bà{T]=%lqj{weI b9g,Ibi" 

p!DˀeV,, :g*\UjNaB.مD ]^~2P ʦ´z{Ddd\Β+ ȃT/#\e~^{Lh #I3cقٮ1HcH|0Ud 뼢H+&! RЀ#c0MAI,AhcDX_:#? 2Ғ5s>;{`sDzcg4s9Xb=2nfΫ0%5rjg.7D)LXw|\oHO]4@ ` ?c+f|1*5iyC\q}HIV9|u_[˟뚝}^߾6ⱸLT_7_߸qwC(d?V_R*xV.uUJiE$RV+()\"D]$LQ4!,)o7y|͛NAAߋ/i,Y#w-mjƢzlz qzIb.P[v$9iYm9e,'&pH;CrǁR{= ;NR!NޕO&xn xX ;}Oḿ7&-'N?n,ՍOgԍϏ/;8IQ f@ʁip^Ł_ |8P ځ‘A}J TcWAQn?Mɏf]y 4g[H]eNA|rg|ej&Ԟ>x4 s7;lQqJm>k.XwvEz{2Cpz~)܄\= K[Or]`tw _ QFIڬ\Mꭐ7_:.ҏ~M+^pߥyA ,WsfM;7ކ1e^] dUWC}?n>j}ϖ$Ës As*#}gybgǥ6NF&$"ٟF/69éz<' X_LROpur~?=z?Yz2rAX8>}8_41Q`PMnjZٗS4g[ի{L6˯,,W=i)xѺ&=as/DM'ğ%r+hAWݛGI.*[ ceUeYCІ{7AQvΩIL:!f6{kIYTћ!foG3N4FDv0)4uMp~8\ %cԟj:1F잺u/hS{.Kpn \86f^ ?A|o=$5%=@Y<(9?%MޅM uר+z?^WܸPBYe]AH+%**FY*D0rdW]q)uEA^W܈0y^W늵(B1;jfp$E}Ȣ*":(7F!7 қAu*Ma콦ߜ9k{Mq)֠)Re> @Yx$UA\fg`wCSC\Si}r^|,(gA}O09=u噖΂j9xha MW^Puep1P\nln7vz,•Gnp%Y*.4AI)zkה88CRR-'ěx 5(ef{8NxyK|+l2j_]5_Fޏ&٬lu>Y2nct͔jr29.iQ:`!{eZI(ja _^'*kB.d'iarSx`9~ywBRIS~ܫ{zBU W*^Ez3ٿ=sm'WaS{byy `~=UZjuŕR\Y`%L 7ꓯ;\ 91ݣ"of;VFGʚU-DTڥ2:\uӾzBm&VהqM "%" dm _j1zk*y7X`wŭWɍᯥso6)q M5 vAfKVhtzN5ϺkikN߄kӌt_.ڠۗA [T>p)۵cϿ1'>^u>|v]ܯ5 ޤe#ZuHDQlhڙѨ7*o!d}O&s;`X+ r.O}629*7kҒiJoE<¶n& -)!GoC:8վsK6YV]7K8vog;MaPǀn(9|Uzp(7έr<EWV3`pa@svؾp|Yȱ(Pag2m|6;< 'L,/hNLg˶2ʺl(/73(ɹBW:OMr,œu¥lq$qDF(L西Ӈ9t)_ysBW}٥ :5pq^jg?v%nk)Eجcj+*qJZ͔yJZeM*7(~6@Z H̖`D#779\@qLbp,sR&CVY:Ka1fQa٩-4v.|V+NW _hVsmZ j>iRRhߑ҈쓖$$)s:!) h}{ia5 7%WWvE/pX~ת&U|=h^qEI%cr=h~j-X 2"8{4 ftSϰ$ZSJ& ʧ4exW*˲'3\28xxˢBS,"Zd$b(%T1EkA[yff|78 U`3Z޵~e='?@fOq9MizZ0v.C3n>N0tP)^V m#PTv:|YS!\Lfyƒ~$R&De IT&dc ݽut-ut-u/$B\"ŤY¼w"FtbgcH P&RI3pEX)9|,)ἔ_eήo0QIG^ Bc$"LUW˫<O3)e-z5wJ P" "6zn2}< p}~e')=R~ ?ti/NF#UlY~hA")eW[ 9Z.F3ug.TqPbV,6tdbRsM圷77AV7z+gA K̃)@ Q ,Ol!;fW175RÐ2,S,]qY%{8 jBR_, $̽ mL+Zd\ aOxJ-c]];dSmZa1^FhuYQ)?I#ACvȡ-E̠+Y2 p; ,Ni NwhT+٢_'2G%^#Td]Ob +̱L )kl$ʢR:W̱ؖFXpG.~9?J!ҰvqbDZ[ޮ'5K)Vs3]iYz/EK+[1e̝|Aԭ)eZ$xeLɆ3AHO$ NKU j v]w)j272Հy.&Y*6%+"%5^$8j!YcuW)`+!{HA 4#45fI0,i(^VZJ(2}u N A@nr2JtKcj{_Krd ./.lCtk3zbKQ$R"i%`aόZdSbUX@vDv"nӠRiES2P$uSO_g3zʦ,d*'o/}t*BTd/ RNtbA+^TVK9Nr Va;+qlX>*Tv]#вK2:)QAX+f!jkj } ǥ. 
}E1HaEcjJĂ# Z+ ^S$҆Al EmNŸn# Z:32(9a_+QVE.CbDQq93ǭNcTg*噁<%Π!RVQ!2C 9I1W[ =ձ=x2 ֆՌ pHpe7FX(B fhdP RФn 2& F rS8%"8ӸZ0Rq2*.,_m "R~ȩ1n[zFfgjTtG&EbBL״BLDGᢄ⯘8NEզ)ZL#osg&P147duB'\,?+ܓyw/KeݧdJ @?d4oe  6sbcd/n6)>KX:B~P&}.ѕxMt E.EœK (':: J;d2=KPj_nY2%/%ZXN?ï! leȀ G&Gߵq]eX4 e7wyd)ΝS@r%š\d֑s[-QG9-Hd54ڏsIL굟D=I 4aDPb n(eroS`NK?`ɖ9:e2UNGQ3idŦ]rUadO̸-؂FVՀutfNf^K=훁4l[1c;gޛ~σԓk<N7p6sUEVmVenFEWi7ޒю +wᎺnHܯ`RNɎk,Pnp*i]m^(b24IO~5yJi5·)CEz/mD ^Olùfe3ݘ=j7⚲#Gy.J=\Gyfo6j}zfKQe9Dch-1|\ֵp/7J/[04]_G?.K6_㰨8^s J9p5@eB:2/"r4-HjE@-@} 6IP @KBAm5`{(5%; o+.tp!0tpL72ԲcCfvDfДW,=邞Ks 3Kl8wEK&t3&kO8K."+D&MRdN9;#VlÐچ8f ti`8 b͖9ΞQO PH\X#Ē]!b8d**T*E>~PX'>Jr+~) LSNTS7cUT~(5TsEĨЮA'yzAu OR ǥD1 ȩe?^`8RMzx=j5_ (9v"e܉mm ]om[N$+"kCZ_6mT$x~ݞwω|ލG{7.|h&ލ1{ir4e2iIGH;J3]Idgl΄`h}+= h߶qCP]jJ=N'7xN(QI1ƨc] SnTOm#SiU̿ni?êU菵ie+UV?YwdT '~<k($ڼս7yu_5V#)r!2lnmgNPEI:CR2)2u)1?urW>Kowb2m'/H|q;r6T(-~pI?MPsN֓,ƀ;R{p1aB/6ͽosWTŸ`،oN8ΨaR%r Sr/h,2˭RbvVbvAIS2&JQ6BKCDc\/5$*_nP8.U siV\ٜc/ A XP6o w5;'jݰn ГFtځ]*Pd4ۍH c40UYJs^p!F:rg-Qp3- e`0$KMB w3y.jå1"ŕ j*GpL%\{(N-P3T(N6m 3+R +\*Gp/{.S_`J wh 3t>wxP0vE+$ُvֵ霓AAF3Xb 3D,BCŴu2wZ h aFPA1%9-$w)JI[StҳyT 1^~_pʕAn^wVOֽvu[(kP©/|þz %\ " _!T3z|nn/*oΝ+1^I=&k2J-=WJ׍k Ռ֚1(EP4חj+o]4(1%ZZzvQcq f\.5~iduH@{8FкUh&Q# 1ץ4IB phQ[4(-㇋܀Ul&acARR"UqY vqyʚ?0at;NlDR4Ngf$唃R RRH"CO%ESȹ)иp/M5(t5dI3R#S) EٴzR#nlT.ӵk5%@*2B>&SFܕiLq4dD .8zKcIFx@U+ <6ijrgF>kgcàH9TvIGXhq{o{#aU#z3KV[䜡 5U ŠDw\)d8>?cd8Ngm/#Bw044)\ iA@%;ČZm'F.}XyeԘKmۂZ7]]R9C-D0+k0$C zYk{-?y[OoC.TGǴu€gЉjZPsga?TUr[klemĥ5__vm)woײϿP^WDu/Kk;5hܞ/4@uO%7XXb֍_.kenFE,0?M=N^(LF 6(R[P=;ƔP̠뼘sp#Iw1s7 ܒ7^^tC:r"}6cݒ(d4K;CԸ]9yZߧ.ԘBUrk۔sɲ(Nt" (S<:A+KMɅJk)3,F0]("ws@kPoʃ Wν59AW2zo ycݿeKGLiR&n]f}a]-W流o6M]jh%.a"  V8mE.jTĢ^k בIe*~|ƅS:sbܯqާ_?y}Λ}x%xeb4yzy͘66| NpHowOn͞p.&neC:.Z] hAj)oA2ϯ>𼘷!ݫS@˵+6Tnb]'v$v-$}&K { B@=UۚX"r_ Gw B֩aU ">ew~ⲫ9 ů[x[8]ȽS!OR "*~\rVNO.˭Nx&{a2}}ߏ$K<)zhs++7b 3HOy0͏̼qNmL\[>8O\߈_Y[kxyj{,],Kh},?pp=^Щ< ` ޔpI'㗏oUyW}s.RFys_2yօ^ 2.ȂU\LsPZƈDfDrk^"SDR5D}ch"Ma2WZWZ* .: - y(ͻN%>4D/QOr!2lnm][o#+¼20Oo89b$ Mm'"[ܺe`m]X*V9Lж*+URC*I,-1? 
25 Q;OG W7_ÜTB ={޺{TI#_Op SG``MXIRw& A5 J;_S_Oݥ,~ٞLJL[iiRCuR)g8T0FLqTM piRZ]h/.>=]{u:˩rU2рS}ysVܦb[)͚)"d]!u;.Ӈnn_X;-*7??gc> @~#wrȓ8 [h?Ͷyڸ8ns&>.Yhs}sg@Qa4MTge3 ' y"#Sa>ݚ]^Pb":stn/h OLe?hvCB޸O2&oElR3x1fO(<'dTGIeYٓosb1%/ܳ\#{{Fa<̹4/̍. wm//2l )j0 WӠo3~ 8B[׀^ fO,4n(!"_7Z`!7w5XÍ'fO 0" g{R4T?IQҒ =DKnМxer;B;:KH!S7<ޭ Wagx⪳;ۣ)a쮣bDpq^4Bea`oNޣ` EsD&k )㭭Gq!/]Eȯ 6&A'=}]=owڻc6[R˛F;HDi-GZ%"LzQAp09Up(,{%KXEPlEђqJ%1A`$^Tðk:b3ScjR׉c!r*Q1PlsořEmWT'J@o_σ|8m u"zL-z!x} J.Cɨaۛl$$kƔo!w_(4Py/laN^hsb(g(U VP,r?a'@уBXWQCق26qTK%I1cCU,(R ːD9@$J*^J4ܹp"JK*p,f!h]֞Cs]56& tK5U-bC5ں>en >e\mu<=FQBI6}T l}()׎kvkf];~+H;Kd^ޘ왾AymΦc7"_n}a>Prcx  Q[:#ݗL$Ff/D9蛸 u!ޡؠk.f`@ i'|{(` \уZAWR ˢ B#;fl(SX(dU_c);\\Wqt&9xI {z j4 yiȝ%e"wܵ̾Y!wf=n|߂=3?yVBo*eDVV17H(=FH1Hޘh@W=~P⌏3'=gP[F $8M*.uBٴ6797pj@ Y[aO_{cPDV l$w|پ`X1޴.I]S611FƄ zbobEucخ{u!P~}q!Jh+'h€װ|Q̀yZ^lBRt ˪j4h (WREXZnr.ib4,5&dJz %w)…/c7xXnYp&a#Ds[Y $'VWD&TŨw@ ^ J'P|GJhhHM]?/Hqհ_?lɢ;"7k{m7i}jxޚ^j/ސJuu?cujw41$J#|es4HBU)ܕgH2nfڌJL~DsBFY_xd|yeQƺ C'~<(ăI<(ăIt647Es-<3\HkDHA:Ri)TJV@@9**kQ\G?T3dVQ]1Q~SG>]OFX.6C=<4&vp>ٓo-(ֶnA`e19qIkNb\sҌk6e3]&`FH%8.* )[LZ'pDY2;2lCL6'hP1%Tap+{.;W||͖Sٺn=zV c AI 2J RJٲB\'>@#*-aJWvŕꂜn`844LZ(U´e re(0$+hJԫBip=ݼ4W%nXU)tq}VlEw"SE.ֵ^\D \nR3C,XgIuNt\Q9/ WPOE3(杖f;̇wpC(nN޿kR٨u-Hd#^h!Pg4!U=*bJ~G@HǮ=^t$X\\)aK#NVJ+wr)@%*qHKhYjKC_{z*.\12s$~5 p5^\T9dT\6ޗ)QLUIUT\{8\7>| 9@uxєl2䆁be7} %gK"b#^AfXNB K2F*8.$g xGHޒc).tnut6sXrUIǢ>E[ :sEi[U}BH ͿTT*[IV0=6:Zv;  wM("eL(J.n@WZ9Ѭ!G\$qD}`3R9 B1#J(sLrԔ7z Ξ12Z C/} :֩ l5*к7f{}#Hfz(EqK<~j Yѥkxfv9kQFgȑT{!HcZ Gp3Jj!R΂4qݠRh!\$(I1IFWǔ*]݈zJ,}640|ЂJX[h뫮JϿN -s<_ȼNG9kB`GK%@װ$yWϨ>* CZp}v0cC6FooZŦ&}t1p;kPyt\/ jLɉJ&}/F%ڣ 0Jb.%E.JNn?a,~*Hph,uLƈ$?ѢY2A8y4|͂'|gKA>Bv C:s€}5U B   Uqxũ<NFM0iw47fn9}Fi9.d|)s%7}Qr0K+}g`/])G{[/"{l|Gf^VV1V}ټ9ALjLZ8 IfaE-v!̻8…9s& 8:+=!I$ M-i,f *CbCp%h|AB=Ղ߷8]Z6c7gqܚ?n>+T좶wû[7-$JƇn{]pz Z,{27w':F7b)OFOMB3aoE@0l;5jc|$KUuZUZTٽ3=I,jWݔ0-ɝv{}VIvݲ4U!!o\D)|')-5s޵>mkEI;'mx@ȒBN|3/@AaOѹƢ w(O:=_ SL?t"3Zh >})P%K9/!U̓|`'g۳ opZ=:5f-\}!KҞG]yQnhߣN ac(E%K 9Iּ$7@Kܞ"i䮥8C `ߪؘނSmT!!R"Jb?;5%oO*/7uLFʖ "YoulLbYp>xi- $|bs̗U{Bɕy0Kno0?!YWsV]QA 
EH.4"5TC (s+Mʣ+=8ơY5%q^07ӛi\5/G] ,0 ̄T %ܤIŭIAsͅ]IԞ-~mnSmEX(8{<^$D3ey&kLF]EvN(B PoS/Z&8i j뺕-^U 04;/z[X@I9YAj2M,<βm`U(X691Ȩ= e(Enq(ZU+؀gv3qdǖ K!螲`oT:.V[6,olZiPXغbe W>5Yx2+IG`EZU Yp:Ƈ޹5'ɝ'ﴁk^-<蜓;EՃC(UB nDIQJ[ o /v'vj]!c!0zuP9*@ Y8׾FR@>0V6:FDss_ 6 @0B%DR*@ET(4o7>H}-GG6;)*hKIpP/0IG }Lc" UBQX(@Xk+ edvr $ B#ܪ4Ņʽ4ܕN'׸&;|' у_<M3"і뛿{+(D IH0ς>iN3X4.Xf 3';wŖ$=ҺdGvq# qs69p 'FruXEih\a+X4ۂ?&1i9;x\wzdsA8N)\@A%] }Ʉ$`^@˗a*h\26ePn[0j*F8bk*T2T' pS*doYQRjk$?U.刋}lv_|J}e7kJGam 4 A, fb=T90YFdHTb#Mz". zڸ5 1ߢ@'^m`' Ԓܡ)Aĕ |:/^Aj䊐p1M9N xNA؅Cy Fh,)o(Г.Hz"@ a +7a !2  {1eBb4A*Q @5|A4ڋs]_ǫ.rϪ {ڰ $BƁ6jGv^z)cr0HCN02!3褈X8B3et`#"M=U \VFҁV]wõXHR^3 0947ޢ܂pTsڭB:9g%-5eΌon3SmƷ5 2a!śf0k:37Q͠;G%ʻƎA*] $yBfNwO* GWip:c6 N"JT\LV/pOPzFrvd~

q&jŧtJg_,_Cυ A.)}ZvvU0P6M:ո$XE.ucWwcH C9Z4 eJ2g\WHHC`.yai P0rS sa.=MXq-@c %&}'^m`BDpܹܴыD"R6Ê)*Iw2g*h:sa !1}KcNh:o:>U  3#RswߡyUvǼ* WYWf^Ľy_;@!@?Te7j~#[jWb"Y@b1{{hVq3Nk %hPzJ2D!c4o;!+=lT$]9fp|m&Xxb9#֬f ś'hnF:|L dvbԖ>#60ؗ׌,ؕ͵ ZZ!^xqP\y/q׈L:_ಥ뵃U?/u+<U0ofgȝeg0uKG*qppp^\~}a+E6B "CoI ܪk]'?m X:k8~qO,- uEA4N9Of0,6TZ0K8Cq/E$t jFqʔ"u|n#3ӿ`Ur,NN^.qͼUW,p8hA|zvO/>,k|vjto⯳R$y_\qqtϟL= ;V~v~O|e~|~ϫnߎ/u櫐ԟP׃tAbɷMt|v郯L\6};l%]}l=.FÇ ͯrӠωIxO[㉺Nto_}6]7NzKe>hAc۔,Wlx1woi '}|L+O+Om:Lo91OcX7y~#$g)jbty~1]6;ap>]x.D:x٧:ҞޘfFfć?]~g`p<:6DXp+Cs|0O_B#h?4>?xm<7Z^Fw_.&$, y#;}r<2d}Tgl8^*1| ydrar'(NI>1"K}fEv͗:RwkXCG+&v?~njxya/}~_fcHE|ؿTF ouHlg2:vnI~j%ߐ"gҁLǚ%Y=Vj׿ٻ8nv7ZA"U~spUfW*~&^[ʇfF3=8mI=M@Pfa4c~{))U}1e~l MX7/M3zTP>.ՒM%8y濞ݼ}s}-_ZSKx!y}|ɢ_>y׉TZ髽%ShfPXolJ)Ht 6(:UlX8R}ow$k )o76aj}ٵ|*_~}`ϯQplr* [,˖>zlw7)gMEۧ+Y퟿ .*ӗ^xxN+/}eK1_<4nJz_ʊz[b ]hrڲ~vZyp" v;jֿZiu,=zVpay-bwy-{JE Ԁ*dO6dt!( F^$V),ْAIA8T@$q2Bhm5י#&kwQ8|5sQ4)Yj`֪S,5k|$4% cL" &Zp)Vby$`YTs1EܐT (r1l8i hbEaF[/`cV0-n$h"K" F*jBy`;=Sq;\ 7eDcPM(&[X *,Dn}M cRa=kP %oI}9+grX#RY$0JKMnz75 ?LV쏾dZ`s :-^Y> ׭ Z*$u lIndu12nyZ{e4mNVse0@P`clZ!g4U:,z `c/`'HT+_OFCt>*$P'w+ZM>,| (^`5h'G_cAC5r2aRDiΡ]nt'|yLQu]sN!)'Uhm7Q"}K+w %^M K!Fi5Wㄗbڸ"~"}^ėMl'r{yQFǘ:*hZex̟9?h51Ab0Z5O 3" d`p{rc8vX7 {1'71#``6[ޕ  ؊w0OLxrw|yNs,2ON4p:==w@ǤpClhdxךǑg7drƘ<=g۵.Ca^՟_MQ'?rĺBA޿78( ,JvG=՛rH\wG Ih։G|o+X`&T4[%*P'}0p5զ]&zb'B5C q `pr՘PE[r`=#_8N.{MN|e'KEVQ(nbZ[pkF'B˺3IXsvVRT 8dSd2&kpG ){j Z^5"~Qh>Ԭ߬\e<0d& .D?V_5sZ7oq[m'_Y+tjv+87}$቗.l!uo6 gMC6~;z)*ʃ>*nUucZU2z\9IJ/Y;πMh|F럺)P7oEлǏ+; A~rck"NEZ9ryUBܰ"/ݷE|ś7߿oA2#|'Ci FlWh1Mqu NmysTrՒ|kc 3l 48iO7czR1N]I>ClA!f,x9/zѦCl{y#; ="S8K.Etjw%zb}xmq>,zk/wY vyʃ hW||SMI!?8x79ޖ-BN0f|L}*c"*ǞZ8>c7 k̦1YBGNIVI GKO,/?)d{d㬆9|9ы5k8yF4ޟ !ۆ luSqlb=jRI8Sk'Bl<tl)m঄zgupd3[/7LB݀?Z4#LwÉEΪ)V]!Vc#7ZKo [$]\/@ú_9Nwm} l+˶/+>ZlK)Kb- R 3jan`% %'zpጲt.^S0R 1@IA"=4keM#d.Tٶ1٢e=ek`e*"ךss:ޖ[s~hC` i5[eL+_jQN=tl{KIԖʻwz]{B۳:stزyyۺ coHb 99e1K93u>J[";e_2lrga8BjK ifj ;> z==f_ַ[;!8vd15nvi< -ֵ\!?9q>ţ!a:jڢ3d/έ7>#1|ol宅 Wҽ;lx BX ͏BƫC 5&t ջFb?>h)/̿TyDetцzW}4uw:e2Gi9جdX+YM,Ei5*ё jץ2mm/?W~&:rwN 
O@5{AL;G8O)=AB8xH!lu&Yp:ʹ@,Xvqڃ8b.g477ql!DD[pQX3ΰ<` wgiwZv+2WS6Ĝ[ٮѡy|a Y0+2~\m b[3oUJc'Nt՟Tݚ1NR}n=Wg)̢Xʸ;O'y?Onfݒq'D 4{12V3yudN5zGv|qD\8.~v+?i 9щkmW卣n\i*ygoԞ4y'<#dBkp~yH8IuRNq"C1~Iȳ9pn¤4h"k }*U"YKgGfGN&#Cp&"x'\QY bvJ\Fq`R"Qx꣣(I5:ڱZcSř)0ڜ=;N5ڹݱ&W@~%.h] @vxolK{m#,}}z3crNkGx'8 JKv㹀{a]oV<Q_yOy% pR5L6h2|-{]!Jr2C9qNlfQkȷw]!m>캘)p`w.R|aં!`rE`1E6v.NAM19II[H_{al& O:КJ {A50=T1IP{ өYd\AIe8ߘ Wf7Q3jx/`7!AT!PRFrp4 =Z QUSi[[Pi#m3ݥTYR-ZQ4J0Xp8JQц8~-8q\l]USΔ4J|v=?ʥЀOBA0ۇfEhË8F1s5llfSfOutuUMl}fw`P06nj"ƃ;w? BQh_GjWu2n oJ&ωorm(̋8Lͥy̦$O HV\v[JJiۄ讦 {5:E%?ͤcX8\!1g%/>H8E8B\3 5j).i rRLT.d^虑[Ac |rXLy&Oh+b=Aպ0R!O[!pSi J%a ؓTl /~f˸>/ʏs?]wKճuxbgwo.ߏ7ۇT,ݨ>9J Kkpu<[%& mIV^.=};^| >ˮNO^vm k[?(ǒ0iy0+Oǐ+JWo~<ߗMf~I>Wt ǫ^@H֪/+:!pT/=PO -P~ `'u{ޱ$<H %Ec= X?=i5ڽ9z~ O&iR]I2xI=R."h '9Z;3LQ vvZ3!BDv{mghʥnI&{!M=70)t刡%#owF!!n|v*S% Ju{%_SЪ xƺ續>m@zK>2hN6u9ikuιN4IJJDQðDBIp*5/C)z#*XdsD.nzXނ@gvvƬaʓ+" `G{T [־*rY2--AMHU2uD(bbDxƬ1fiu@7?X^-zi z{Ml0O'7?Gy^ ic0Įfzy;W_Fʓhw.J 3Ixͫu¹ZjcތQ%3%1{>yNQf(XXDpt1`IO1”gTbiwc }Ihƞ а8vTmK{0 ^jih Y $,ѽ b4 $7ਢ5w94w+;mx!zJn$' '(ۙmZ574&NS~^:mx&Ќ?H 76nLYIOzfϟzH^1,Lh¤'BE/u5Z}=O/t=<޽iVNtÝd.Gp?Gן`@)JJ)?iis4ʋwNoIhy)Cz.]:j:,[^Tڗ^g#5ynG {Gv@L|Ԥڠskp#$[W۰n_ilv"8'=vy:DǞФ $n#) =%0ooDsD<͍˝&?fGQYG#to;󜛇 "rcx(82H/V̗!i{ , P:o]̦.8%m4|J_b)(J2TXfjQ/uçBu*'+ц3?  &Jq޽h,I3A~TyΥ.GBTvb2 UPySD3*uiw$@.wCy19ϳ󤅞Wyy4s PBz], hվ^I1^}L@J6纨&,cOlQt@g⓽}2PXG %+We> >FRAR\; U3.('9)Vŀ!/["E1#^QvZT&i3[ q?QhT3b(nw4<N(>XAa$V0))JmJ\0K ^H5Co= XG}Cb TZeHRnZ ar} }O 1A [,WBp&4O޿{~Dr:[U Ʋ[.1^xD@15 2lyV'0x j'.1aRv΂R, g dęwS]-ꞾjT#9w-JvڅSV7nPw2k[Gb]Zgⷿgا!ISې{;LK#. cj' bt̬F[e?uk:yghCߋ|rENԣ#PӞkn@,q,NQv;]b[Hr6DA'ÍRJz~!eiJ"HJϕ!\2Sd. 
AR+-NH dnHeSE>^ęɍ<5#R\PDk{adӕ<\ߙL>0݃~9id4,"MƬհB]LbY[Qb]dAf(LJVY,es dqClnJC8M`"LG53LD&gD.pʘX뒮uI9zjau ݷ`jqH첈K&Ү[CWmxDE9ƨ.IH,w+JR]Qn,ƣy?XxVry+;~ZOԙ"Z>*A'%dR;-")r?Z嬆c [@pkKa"meǪ=ڎQ)9GJi}6UGpOkS19&@sG̉ip^9M۩Ī=N%ʠe%6Ҡhh'bs$sjR}4Vyib+E]X=ۥęs?yq7<'ԣ"Q?j09MǶF>]fяl6Xg!`FS?OWj}x3O\^b"/(7߇on28k4nf9a-D/^e\Fޥ aA^kq)-fPOVKy^ 7!*c!BA+7 SkO~p Xqfk慠HzU(u(|y-B`X/i3xh5h]o2D5RhO +jT ڨsKxѢ LR̤Gc3c#3vnȽa=)g=Y<7-fųg s UwɚwmHr7mE`>mr/7d-{ld1ؒ-ld[j`rwydX|%f vdJ:l*$cbF{#k bDk|܎1ڎDȏ!(H5CT#j]x0O{yPև1Rځ;] ٯΗo7_#CP58Lo$@"=M W}3RNl N}lw` ) kQ@.ண`_LmU@0To-ǖOmZ:N)Cp T@Cs^ ABPlS%!yA 09TH+gѓC@CQ9AKp ww#6EC{'S/Ԅ@KM[hlHнRXI\+SP(pnHr GG@k0T:4Qi [ŝ2l:Jj]zjT y@- +FHcmjoڈ6-3TDSzdUS=4XpGEQ-ázK3"6`TH=LG؄B9/ #쒴zMؒKv^iq)d&]W-NX!AK_w4RQGv8nxUmd;L+`Èf"R#if~mU޿%K#ΗE);h(r&|8lᣁH{eń\(!_jeWk? W;ҋ7ޛtmƛq:NdRIvRAHR;GM1Sϟ])naZkԺR D})+wZ(4#pӵT{;$wj\U9,9fWDT0򎧃mNZ; y'HCy00c./%"dW%xQt~!I2: cNիG ;F7J1X=r {&F-ƍJBPCҋ㌹vd%xHiJ Yᔫ2S4'HTi-A^lw8ogT,#ˇQC[1xO6u!^?A$ gm[Qmq|f`%MHI5ImǤq( 8 l2~Ƹ2lHNGLA1 n*ul=1KvƤTFbOOW6wV|d-LDaVTs"OJSW8DJ uB+[׾V>MmMW>ocQ?)r!yz}1EPRRFYw:S"QKzq5CG^r 3[UK왆G<TF=f_Gfc4v{};=+waמ/, uWesW7v\ @s7f1zpEW>- 5uxeG*Z=cBd8ً\m0LDD_vi)1ZR籓@J@c sCC-8vxժUs^.M #ړMf80ުeU!/8^:Փz"2?/>|jP[Жw . ,n>K9f0}S;oHKR!XLaSЌ3To*_6WyВan[خ5^`$!GJwXYLv0;>DK#c,w p+.S"?f2"8Ҹ_Foh=A;\GLW[EPqoRe~$YFx#'hWײ}c (HD*%*X/?vA7 w`6%(]OQPHl]6PUU9|ݷ2iƽ7d@Jy.y7\kDRxCJzn̴l rG^M<`tXMh7Ps M@ ~3cDH<^q6o*fw'yuwb\Zo|1$ \Ս3feJ9(,B >7Z(!n+Hr C^Ac8’zz+ZÐ߫q[д{' cvPD$< veGc /Hu/ԫ+#PxOu Z V Ti~~疞|<܂8\F"84IR]L4&8BpWRU ^)Tڋ?Scc>y\~kMn'og75QImgP$E|O (J&v!)Zĵcх /=:X)Sh<ҟ8o?I!?~R<]I:öl671 ڟM4"@ ֋H4B] O"EO ~LzpNB:p )ޗ\6\k GMqYiSci/cy8EF$DF޽f4,e+Jc.+'!pAzY< 6Yqs3wg_p ~ij>Yg]%=Ii7r{&r,ĖE?/K8bFv⯇)ԤZ竷sՖi~ך+0yYs =5i=n?-a[Xa:`bGvկ?ߦ?G 1_@ mcAquEƐƺVQJ lR\$5. Gߙ9sYr|sfQPjqaIHLk)DHa+0Om8?7 3DJB.9 gy_j;ηVY=?"玈2 t! 
4ث(SV0!W5 =Jq`Y>6UΝ<9'79yY'<[X 2W5`}qFȶV,wnk5k|{kos1:6|vrm{욿|H |ē,7O ;Ik[k{ɷFDJK/ ;Զ;8L4s'+NZ҉ҀOf}p`N26C }-(f*q`$BZD2 !)x@@ĉ8R%Q~Kfj5 ?%5H.!YDbˍ5,J">.2=iԁqq8Ν ,|a4/_;.4#P)C%g71i& ,( 6,(/\5YӗB>9>ALGOd'Vf|C̘]XetZf*P )" 5'Jw4tb|Ct1%$˦^DAh!v%8K曛n:R*iuhudf&7{qwܩ-G)rgT3; A ݍc?U3\N#?O~ 8f%Gg]XG.C_VmnnMu'UѪiA\>A\yUS2VQq7w X9xRyZC] !uCPqmZ2ӏ)Ra' ԑ1"'XN^ "XĜ[a]9-w$#oʟnQgqhM0r]Sv~>;:miFgZ]ڳ8=kitGg7Jf [aw\4k+x|6Nwf8?#`g2'DUxhrBdQrb VMxeUH1_D/9 "FF )`z_( |,F>(((#ᕅ LRXjxei.ƺœͯb{f(ĄETL2Hm,hV^ޤt<K#P~t>":"4P1FaI ?< $p?hE6 _dyA0`s /,[ KIr!(JGw>0%HT,? MkMcl˪r0#RmϪiɪ&(C1sYuŒ [ yĠq|rl˜dsa'''L0Xg6Eԇu% Izd $ g(=b6X@+~YAo4ۭ˶S 71LԷ] i&Ptں}`QRNNfK>kr`O@c>@Vay ShTI\ `.77;BC[d-r%Ƣrj3s|{y®*B #A~  /t$ LN b2R»)R(|z >|Bsٽ?K Ÿ@G~2Ua-9m نD[ᰢXl)@HLkpX UeM$|Z0KD\ &tjd(S0uĸ8pRl%&f: 24s#L--*ÚliLW E8\:rQndCC1aU"HP jv1SE-aP C(./іmYrBwyHPs6Fs\= @ʵBINUعWOzw:@ 0s?>]ǵKUo9NrU}NDP>:RLLɗ+]s):©#ꪩh烍+!ʉYkՕKJHpEغC12%yQ;QRLvä-fVc'*]TUe1Tȕeu7z1pJ&Qx]LA9L /# B0씁 Tpc`\=}。cT@97lrȽr~ 1ol JNq:sz8p!^kZ,9SN0d;6U iN )5Lnngr"ڮ%q")ۚO6M M>L{`sms0[bT nēvߧ!I1 LS@ `cYțU{CU`!%e%D)XD`*3.ӁrJcAy ̏|r!}3H(ܛAI&(.oKɷZ,aa:ců;&xv5_4wz_z.qovq~;8{=O _>|'32'A?^]wo^%Q޼I?~b%X^~|w/o_rw_n<3_?OtwUߝ,{Ս7_L;\ޓƀu]/Io$&nxp1OYEo^t]'OϚ#)9\H:; .AJ Jhau ,7ҁ腠&lIi'a Z0Z!%s'fϹ<<<2] VPNlĝ(rwT1H>{yՅb!q;H45%>;o$]p:z/@뀣;yNUo.|6#w{IVg F>޼ގo= u-J8' ߹QOl]z?ϿA.Lx8 ɷaT@G`쎠;Gq8^L/[f߽e dLmmҶ)ЮwᘦX{/!|q2zCz'Ƭ]ґYd,C}w0nGzP'O-R0yA[m+00A\*.YBDjN-#ڜߺ|kNXC;1?Wv*B*0*>s[s*){6r wKKM+a\]hfUA$30j'moG&prL845G[3sta~o' v-(Jjչ7w6J-tyeaʹ7dGc=ƚ@F'< ʋ >&lll}S5f^Lj$dc{ 3~kKRTڞ4қf_f9|rŧVc%س;yl8>&Ըx/Z]od%SfYZ4>69v~JBh ɲlvZ)$l]:Pi@(D1͹ͭG>l4m[hhxRJ箎 ꉒi0mE(AN"Z ;Y(cpR褎n{ T&8#yQ FQkr?|p`YGZ_L}ē J5 T+&U^AѣZaRuo\׊“EFxl[?eԖpҚ?dA9%;IW(ʜA62'FI5ZZ#e,*EQ{+e⢞ f$c$@h꒯F䷟d A0$ivViJ)WSMe_/V ^TVjZR멚v>ْzb06Wwk;- dҿ$L)Tx V&X_~kS/W!g/o_|uþM\9a*|֏3"MÆE`baI=q_B#4W<ҢLi6Z?lK6WwmmJ=52W!e˞J* GҮ8C'? 
]* ḤcKQ-*kP/(r̬ͅxI,`:(=H)Q[]}P' _aML{MFH=5zd^IƬN_(E\'Vp]DHftQe4P3J@yͷjAh@> 2zw>|uFYtuFVEg/zxHSn@˂w!ʐH8v[؃T%dI0VkyLs2fEMV6Y"mfobbFM zUUcޅx:0FʩF-zKϯ>$?9j@-8<ƫ x[}̔;.?VL_·y)-F[Q5U\f:*vqz/}ܔO2wXξ`:1O};Ƿiϛt~ 7 >$a0d6%#+6: M8AUmR …7͸`!}ʿ ުRʻ4d3%e'QߋĔJ]x!p4&*sA93#uҎ4ֺS^WNK[bw\:o"-/ҵv5** $HDža*֞zd$K M49veJbyiEip0{aym?87f.F0x /H 2xO̿1V&ъwnr_,>HG5]Hҋː22vѾ;{xpK~Lh)du hr Yr/{Ty!ʱHQج:l8wAe'hݐg'gq6Bb r! Tz/WS#c:]&8F-1)Ra&@`*1SfwxkTWT\utwfbXG]bgh˗泥1ִsyhR%qnb+6L! VFt᣻D+4+NX<L9aXpP9E2>I+%I̤*]wϼ>[o5Gq|nE5}Q~b>D)읣鯡Yr7~*)_r{$b %oIg*l馬΄Ԩ\11P:S,dɞ.bݐт7X` =٪bfjWRFKދɈ$=֕Z A'\!{Â;X[ E{c>p/%FTEB%2K X*)͵P J5r0P-cDn C*Ts^ӺHe'v4;]0xbz?5E>93tgRtz\~`9uϗynbceDtw$251ϕ .]Ū Yw T Wq#_=OPaS*/ -p-,j;u/;m;*N6ס{l]5'<>߅ vF %/H4nQ[ ;~P :m:RX[[|HectiLg $\( aIr2IR&U$*g*0Jx0x Y08ay"pHqw,/jGmɱ%fETҕU^ .'iG)Oss5nUfTuFE6*RQrQ|Ms'9>w7n>l74Sp5 6⊔R` N2NhZ$v[x !p(p#$beA3 CLpB"Z,-v+5MܐH&iespǵ {'Ȼ7OؾY:d]6fr{4|^}yBWr[4|^}yBBt} 4l/]uwzCpi_%+*YWʿ*[eC CˀT@H}$ϕhl( V kJ:?M_~8H҅M94U=VK]pߎ'z.5ƅL0U+i#b`*a.F **G{ ȥ2|x@ ؎Yof?͂YX*8W}TOmsCc'1ι& %=۽tPF-Zo 2p(Rki6mNSm>b"vHG,lk`H>"2^bI 7O1zy`'؃F +5np O[А lZU9RF &_* qz?cs=FKt)~ ܍8Nd'.Y+{jHkR,l,49"6 (gq[z|q)ʥ3qm#``pN3sTrrEDD6D Ki CҪTeyT{%gh-gk3$Ҝ^J W':qPXKxŎdv((V'QqFiB=\ \F޻Vr_ofET1d~Ҝҁs^5AT"A;DйLtC熈437IoK~J. 
t\m VVzQ6ϟ(^ոxn1հ]Q6QT|f:hO뭦C},ΑhB+2, k&ɻG!Bq.M]޿%&֋jGf#/#vg 'R_<<!1VG #i`Gf=l1'gxɐz-22S!`gj iJ/iYiLW' h lYM瞟6}ɳ3uqݸw:LJת!C~Mݰ˯ŒXpʯUjPԅدrvA !Tz~-YqC5T6F-ݵv; bq+Nkdy3JZ խ0SXO˜|j|6m#E)8p佺 7χ] 0=S]2Z2.R2W;Xy&eF#Y&e8Zt8"Z57ԜsܒY}T7I(u<n 5Dw#fV:FEtU`6MpaHx4X|ҵo@\9ؙ ʹjtu׮IyAHrxjrywTPNxۯ>L;Ο(zV`t)HnYmIb}^]^o4nkKK[ ,I#1厭pIK.(LdR¦L.%_XN5o1O}0-RWy?R̋JVrG *&-bt(yN Zә]>7xǹ#Txʜ(ձg;[ ԾZy2ޅ]˾j[I/g!Y,^bf` ,`4/="ۓ߷(rKjIlERX#BNK)(`zYoe NT8-JR0z cօҕ[g4i|+R'K e΄&ڌ5OrvH+U\{h Ԓ6Q` ~ .J8h[(yVK&2v ƌ;ue6&NI>|a,o3%Ӑβ)ǰ&^į(#Bi50NrŁZbU3b6ZF 8Ukydu)Yr *Ph%xbA`ѻ2P]oD#`"T9q/[c㙍mqeA1 uNcneS"aE:Jh$#EF@e ݴ^) ,=d Q=WȠōSekJ>Æ9+r.fvn0gHyn)ۮ.꼤~Ga {sZ҄Q}4>.$ΟO+,==;G,G~dW]r!zyi4=)x9sۮ `E)ddaHqMw_^\V|gl\/ Abdpa8b$ghx`|&G>/៯"b0b_7{Tj:[f-uWNb蜣/VSl`-ETk\nt y.)W'w=kkgRZc?G'!R("%5TR_ەЯ#8[ ngmP&[oGaAQЦXuoP4,ưsýfP+:s&nyb[toET(zPC 6q ,M[{K ,EЛ䚋5 .fSOƱHrW̧RM.k5'_k>NGge*T^Mnomv+)Q@?KpHu#_ғ.SzeJOLIzHf-c04vҒ{/&LD˃ID/o)a)#C"O?g՗(PI$&,(yfBZh&Y{]nVJqe@,8Rԕ(*" 5 ̉2 +Ej>.ZH8D9/ ]d=d#;=X9ʜF_' un[ɪ`OFdFrq0Gĉ0Fo/ڒcz]<"⁾KVF<,١ oJ ^JC0EBg *Jpm* F r]=a"p2lڳKd(SJ@TVe1I< Ym99=F3 WXd-BCV,ιsi<)E,:6f=Bf֦/ЗJG > Yl:x(8ܰ,~<).&C-s4.:>.h'E#BݲƺG H-{ A:)R(ǣP@mAJiæLP Be-J[?;Ɗ^I0MZX*q5]i͒u @35V%2L(x92?.'eZ~ C|#-giI-^N'PK#㇗k,ۺޭii @6]g/eSY^ BCB)-׃?FQ?ZY+EgO; @k \ߜ޼RZ\(riťFsk֯9 snʕQB찙VX-np3x`L<+ !,&).$i!Fjh֛' .mUq~2_=ܐBC9fRAf6)9SE͕MMQXvnM GT\>4x(0Sn)PB ͏w LN3 ٕY(CץaBD%h+ULXI^iˡ/oUF^dV P7|e4.RcriPRJcti!8.+vA<1/Q9Mj@{7e5"x_݆8oK ELr,d%Caiku(jc(E &g31V2 k DԐq];B8`ѪirfP M@v ]+h_5ZqlMNvwM(dM0p'`ӠҜԉ^\T^BXΥѹ\ ֚V%cZ3%`4m 9MxRRh?^mr -Ѧ2Gxd2[cKƒ p&)zC%q to!?X*lGގk/A^ )+ ~|IW -RrktRуHZ5 ѯI,JnԊbvK|Oy(EÆ"g2.WNܕ/on˅lPL>dH +_tv"`&+.LmS8G.=Gm&(Y,ɲ)B gD:B@&ojM0!<ה1Z)98X3mapXB۴z)iHF6T&aZ7"z<݆)ŹXtoMՊz7\$Vv9qYZzxXNŦd+u߯߅u@9b'-tbgYuH[+9edŁYC8-!u-~ !2AIIaM!k؞oԥ$佮Յ=.S^ {) {$Rr"m>.r.Ͼ1W[j >/Y&fI1 [yUx(.9R[/OUqwy_Vb*Ha E?ʈ B&?@`RULqiRh%8׀ aaI@P@&dp !'D'xF|e'mGѡ3xsyxD;rv._jk`X_}ٚo(?Мͻ#baNwlpMC釣5}&z/=pGRm|>:C/Y%2(i]ۆ`ƨ =(*(;2p^IfLS4D㙼=F`,?xnuyi+502IpX ǜ~{rw5y̎$L1G6$&l2 Jp%yT%7AV މJkˀ.J":&u&/]4(*$ %bWyeH p.>18K9M_׼唺oLN1eA)^ZfHN:g?MkzU.,ȶuˬ6(q uEf Jt `X)eU֔ 
\DQk(sS͠z33;\J\(Xpe5ow_0U@,CV7qYNzW%OoϏuH$f!=5iYvꂼHu^(7fv,0GAϟ# /{|UZ,=,썖]Np;wx$䏅c(xye:{O|#.8Uw_nu&)} ҉Y$id{VMn\d)^%>|Y"mqXH_ܞqauJޥRT<Bsn{דzxLX֥BR#"2(ǚ}иf8Yte~粨* nnd.q"Yų1-f-q5AuǙ E۬?^GWښgivV;nTd0ϟ=3*{mT2ik8'p Du3mi)pjP*`j¦C :Tɽ$#26ʾZh0ݷU)-c8%5Xo„R\UIјI7jHS^\E\C\IFă*PJsb*hsն̤TRpԉRQ {dC؟`9h~0F%M2r^{Wt;yxЈ7>mr3ڵ]GWIZ*= rmY^)CBTtFEߔ#E,VYS9:f&;{Gvފkƻw"α M ާJ2du3'/J]9Yb|P2 'Dw2 PE}LZ 7F]\͙e*!7>XK5.eMs] >Qn݋$aŴR4:Wd  Ɖ @`7 Ц>d]g_ ]O?oc~4W*Q4j:X#!W*϶*[׌wEk$m:2RӭY]t>R3r%۰MmMBB{zyxS^.>$3eG2> 1K";\Su+Ug'kpˉI}P[<~1[-Ng 犡NMVdG[w47ж%i +Uooepc{kb# u!0p< ĝ);`^rDpܵ3"dcE gQM2ջ$($Z*_F{ p }Vm &7|Q'OcߙG+Q(:-h+qM NgW8'ƛEA91F1׋ߝGC!ygg~L_,CQtfY-ɑKh1KuyDS7\RPKNW$ϻ $x)ש |3|/_o=|:Y4`~ތcQM,(~=WĚ`r(syN)҂0a|n8s䙦F"=p?OQ=7C@T̓ 4`Bߌ ļ>A f(&yC )u·6ar7c7^:SԀa  tRgyyn=vaƖcaA9|0*3U#T1FqG\G!-('K>Mڄv^?ERX; ?nh9neG@õ}Ӵھk(IykV{F[qjք8RɺU'(QϘԻBƒ}c4o&kjEw(Phsc)QǾ' /j[(Znep1.u@f] |d=f(UmS.x)k /ƹw9,d,x!zFYGzkA1&w0 I(?D5PAjZ-l Dx2UAc(EeUom vGN+1m,^7p =\ ˧KIW/;nK/' uЋ?>Y%UC׻/V!A^M\#&v(ܛx_edF-"&XZmTh1Ba1fXxީ@^=9 O޸g\G[jMDYx [8)R(]n@V*K³^UIG )g':C'jǓ.-ם[/AX,1WҦ?y41$nQF΅2AQEFOpхT5&êb+Ӝ!*') \cc5 Fy,QKcKcW tOgq=ƴ*VjYkB0M7郺m'm1Gp;Gp-Bu }lx:0b{`gu@3f))^|`)Zt9X*!V<BRE˽]X3!;LF+Y! %S%>)Ε!`<}ʃ H\.&Mɹ+ʎT?x+˧\k>X2FMUc!h>^/͈ =T y ZmCY#$ m}i'2>T*Z'󖊯Q V$qUyO_fMz3 mQCZa|6}Ͻ5ß)Qw1Z_؉87xW(˄ 7Rه]׍wnϑ A0k7BJ.ëD68e@;c)Yqщ3/)pX5];gB@];BkbWV+ࠀn 3QLqcK_} VIޑ/sSrL甊~Q/R~qUYW,`BPFEW BRŚεm$JJpɩV9S3syy0eh M7_DOW:{XrakÜ06x@ cI\1[]Ο]\=i͑\)T{[>mJ;Eՙ͌*ϥrQ1Y amNb\S%LB 5`B7U0ZPdE[PdEVlAfզ e\k+'s؈R'F9 Reb[N; Þ0kfU{mXHnQuJm=6?D Db5ьr,`]ٗɕTNˉEu9G i:"چﯼnn?헽,].>WWcj9IZ=v+v"Dʖ~fltC{QlO5F=ը؞j .,04` <#$ U+a.9ؕ0f+~R)&[:.#]Budk@Fdk@Fvbqn "6 x9&GA*0|Ҁ봀Ye.YTI~4֚c"W7~6_V]v쭧 쁆` 1a1XU@R\v.X51{C(D jO ʙQiPz0υ3Y-W ~iNhw|u:t;X0x/PxϮ}z"U/],yH<[m(3KH,- XYPeܸ\! 
>ŋ%EìtSY`n8y9p~r*2JLOq*Lnf@LRq883; -%*^@p;xSzaRz!s7w~ !Aĸ< jK7bp9 L."8Ie2 o UOU/_m Cm+Ӣ4?'LEO!c-)C f@Z\ QK|0f"u9d Vzf B`W"̸2}J(O)f\t2*x*4{TrIKPMll.,YL4g1ќDsVL4'Es)M!T[…`A%x`nQb+))Dӆjf |dT4<<˞ PRЫWAO DjDo%%VJYA Z9:Q$NQp-6p 02 KJ^*v\e<1-28s̒fW #  &`DF %{F$V0ޠJ3MI]a`!l[t3d"fB\?`jQS_(y`vshumRgSTJwJJEcW%0- -ctr%Tcq]U`=8yێ+qH/>fƆe{^l4PsM6xHYddbVz dő@4 Oyz K֭8%K}c%. _OLގ?;X3?Y vU/Kk._w&g? 3-[ UH#ۻeoonFЅX:/Ku~7|(]/OX=GryZ.TNwߧp+U8!̼i=M̼y4cƼekߟ]/>=4FtBmÓSǷ ХC"$@Q5!HjI 6n.,9}뙁ͤd3.dKJ^TɪUmf:fЩ) 3f]ّ\Kqtq_]ʓW_g[8\( L:N5Ď.`Hj'NwPIq*YŎ VVl ?Bwşo*'fp!")kݰBBpaM[>Qܿ$ݿ\d*AAֺcE,2Ǿ wfd6j KC/OٛHeرAKAV? X%}w{IG`כlm^-IELޠIr)WyDDA"sn BDA !0la@ÕWV^ '"#ҖC@e5IFe3(6L<A궲p5.1םM Ҽ\s yp{#ٱMBT㲼pT]ZJ1S[ >;[:y~baL>hH2p6LeR[eIzr \ԄfB캁ZG [-pC v$>! \rD-A* uk-3+㝭`4:1)5MQB6:F~G%bW.;T%ݭ c4zuC;2}dA:SB4M{.z"DByDc,jPQ# t{2@uB; Y i/%jP=V4ZB^r rƔj 5Au/aXEkZ)a5A!#R&2*Gq}3Urx0lznߪ3Ž۷L^bo:mVlD*ٹuz:3BWIfTS8V AS%^dѥYw ,;eg#:kt*&Zd梳DJrvxɉmŎM7"o6QƏ;_w4 Q dtj?aoIEt؈@jf)@ *2ao#{?t:0z,9+ݒ*Bsp*__SP奴{+r9XzoobyGKv HQžm577OfO|8ESZ{Z4ՑQbmWW&ؾ 29x#@Txe_d>Mx?lQczzrP?T#5-w+qy/'lCfџ{z˳{^QPFtY l8Eɟѯ? qtJY%LS9 -f:Ş~Fqzt6~HV_="GdyCNțA~ǧLd1uA0! J6e/rCf`3^ ;xuOv39W7+umMfȮ,ԫ/یvĈWuyPzJO _ޅ5 ~ e&l]^FL|a«Ӏ[[^Cj5UT k(|$7$B涆Op̢dW86\JԹ[eI=u1kCl9 [z홫W4%C+axѓmpEi-w!gIhd\<۴_4ޞ}tHT0.iy>Sp?yɘ*54CK%cͬX2S-3uY2_r'M@X ٕ' 3VnNU?}r.V+Tfkd'g1i1V9_+֓ Te-I7["ťв{Nv@{/ooyѓ Nm`X^ W"^D#n<r:#8z$E` v! 6[*᭴oFhNrv3֗LeGkv۫e^ȋq:SQΩ H>N㊈rr(F 6 `ӥѹD%iaJ[<ӎ P0 [И׹0=Li+Z6٪ !2(P qkO(DWȁ <\X[gP]>:-xk,;CgcaahnR%=۹Rҥx6[DmpSI``ggrw71NȘWt3عX4yyȗrY٧ղEV,k|Ҷyfђ QvbEu*y7HW=+$P)IW]׆1 SwG3ocE廓&w]PT7GaEQA Iz?yJz;ms N5@S$MќC$|P9"8,Xy`*e @5#d(%g" DIȂ c]LLlxkdϾ=s~jx>th<%Tak2ޔ+? 
fwoIL+>H}ǂ88Ǐ|F`9Ơu&Oυ,D)Bڑ\DdJwť,3ZT N;X9Sɬ[DZ6$+$*uҸjۺ%gnw4n #OKE][5[hLi}0[֍˰nw4nT<-^ں%OukCBr=Z /d{tKFlb.j-1J,wjOgn8{ <{*i17_|.x('"7tXW~d:[n{0;Uj+X:{|WL(+J>17Lg.{O5g *Ŷ}XlDhcʼnUDK!S}T,&dfT`"!u+{oG)ƟVsY )YJ(*@c,oٷ++4SG E+⌵8CXWA '࿟Kq}<.ǥ{bHzovAqDz%RjD.TVfy_`}'˜h.[:F e3r=M?p/?Zl՝9#/@@r;oA咼%cCsz!o(Кd>.fFai;#뜚quTdqz@vwB ገhDɉl5CO˫FO9K`a1 D[gYsp#휵0_&- hBiלq2A[f5QO="eK'F4SftUG*SwfAeudSz~dr|(oqgzƑ_ir,^9X`<` ޒ۴3Ovwd˲)]/4U,W3{O/G>xBFD&u0Qh+R.'::Rݥ'SܜRERpD*ƼSv5AGkh]uI3-y%oxK-QQxIBHhЂ4iTt@p@hDN]Z҆UHi%⥃ґ9Z%-2,ra|tN!DSn;A|ZdM )7S/+!ոctܯWж`>u)K)G'1aAkdIuEכn{>/>17{AߢPr%r, Aqzrw^'__cxj{+2}{" a"VhK/ nȳ6 =^s:KY>2\ke:A d;`kIX 7t@2R7gOC1J Ty(Rڕ߫FKD1ly$딙[}kQJڝW v:gP7nyVhfvO2L\ {fлduK41A^dQӕyo˼[oZD6E-y_{F5 {I~LaJ4݇ƭ}_ɧ>!M7T fȖilyE<"3DAKT][qs%J_o 9*CmF&+ Qy`>*JYThDz)ƓtəY#'$2[VJ⹣>"8.aV3BQ%*:jfNѠHlc :(jvE; @q/쵎6gAu$X% l !%A!S = ZRǍ^jɳ\,%5ʒt<5JQ@ld$F)5)HV2--U * $KF Ssf g[HP:\%zD<ÕGQ|A9RbDNEsA0Bۮ$B+1%n!vOjRH6]՜lF3M$Ik8IAz? L2D$ 3ӟFG;&5oP LP5NvSDAu1oSYmF=e܇秇POX`7^\h䦸W=V3:*iojR# (FQJ0V{wI"dTU"Ct$91jXhi|٦=G[Od)ѵlrh47qTW/kR'^_ W|W5]"ؔdA|ov0%ζ?FJK夷N|~W>nǗk$7W}u40S! 5JȟݷeW@TaZ_f94A:0:$>W5"k] m|Sj5hhGl(^<Gl+E"d/+MCRzh+e\p=]{{eSr% 5@Ա Μ/VptV^{.f߯Amhkԉdb:\"eZ}l{PtE'tg4` &PӋbS%:et?|Ft}jDN#7xy;R~za\cս+b_0έ͸"N 6W#}ٖgf,ك)I ;4p`:PT-27<3۞ 02 xiif/Nev662{kܚu0~!*Ԩ!ʛ$Jэt(YPBР %]D MVsRAJ&f"L8E$R{,a4ҥ#TEx2rb8OZH난Tga=Jz Zu!  nZ9I@{:{--sm`yE1~egdYdaj$l'jut QU6L@`LKc)!Ŝ-x=%Q;15ov[؇f|wx7GiW?=,*< 2:2 ǵng$S`[H>kx$1:8 ޷.狩 ،=0k|2휼!LjFpC̖&Zv0РUW;&Wi tU;U֭'W~1rUH(L^INi0{+r [TzR MO† Sh21N\wC톉ɸn𨣗Lz6*8*QVbJO|4qNL椞цmۜe$ FT=b5"A !Eץv\"B**^ROҔDbb$TMX7t-O!3vbSGy=[MG'-wQY2jPĊȉ=9J' h3Ѣ)$7A .z$"P:DR+ 5::(/9.&f8 0Yl׍{h0<^mw[!}ȰWVk :p#WZyf/ϤqWJUU*hX. 
$eFkuqe ·Lnɂu@4S܉M?>8ꭿ@ |{O3\|LyR1;Lw%SgeSSc[^z?өTJY ];2& nꕱ3͞h z W½P%΢X ns ODUWaK:e/3]cuu}o2nuG_^Cnzj.WK# *8Peb9AX&Pثn%ʈ>n/oOzwHI)8gIhڣC OGl~QOH3K)do.unĢ.Z1/[/z (UGgҖ]v651sch*ohQyq/T F_N+Xtėcb`$7)[ُ5ԟ/B^b7J) ~1ǃ5 j/5eFX zciN頦 NHr4{' ]ͣ&q\^onyiȡ0|09Z#M!M M4}9 ,<M(A GiY٨3)j?0h  ށрsTp4vmxb T)eVɊaZٖOuV=//_ i W@|WG\|UzރrJmys~~sj͊sF# ʫO W=+ߌ6r^1_H]ayu ނ]e'$]j*e}O8ł V %z?9;[[e+!4)弔̕pUWQQ}RÝF4^+1+J;%ι&-d5 hU}d2}#XFȂܾHA`L IX$T~c*LcFfLb\M])=7,Ȃ!n$ut*pqj䮯$ 7}\ =erAZJo?T?_ve ݏ|ғH+~ ˗*9_α_n,U {(\Fݯ|~uSyZ n|vziW4g/$g즑idmI!g++y'p=cb0'_}@(+m~j ϘotO#-o9<]Lb[JEG)I͛)D̯&zg4S =Ɩ=IZf2g4NczX_\m@ybBOhJd)k;(3^328jDףc`UZ1>~z_޷n=/FNZC>x)%f_N=2dLWfDi1/МPdhvQQRtv|u7a.y& 'HMo T9KOv2*&JLz[(DBߕq;3J뵭"%VzluBQR*RUͷ2jAr /"DSvNU'tZ *ǃ1 =Cb$hpw*SXE Z cu(kL5SSeW&uB"oAcY<SF=Mmzetu( rcp m(1TPtVXŠ P#mjoWF>bV$tXNZ 3h(P[)eGĘ=v]*Eu߾3\U =k4ݟC_~჆`3V|L|L> 2;10Fs(/LKf\\Q4= D#ҕ}v0WHzolFq4',,uig:رxœ544nuW>i(Q 5yj}98S|ZG+yٓ[ 7LT!`Dle]!Ή]>GF6@`ooJ&(eœsE49+R#H:mX"Q$#ʀEq=3QtWJrƴo0͓/!z@Ri]mo8+F)"oâ.z0i`Hɭ󲱓%mQl٦,QNŧŪb*/ %95E&850c`Bl׫b8[" τHf!S{)"B38Ln9pR!D ՗NLM3RLG3B5Xd78]TpcYkJt$KEB @*dRzJc%ΐc"!>h;J=uvqp:;h7 bJucy 8Al,Cr=g7-o[8u6uy~p.plҝ*w$;- MUmy˂yhߎߤr؟Bw(+-U "){ !r{  !L{K 1`DCd7BۙE߇e}sͳk's e7T!i߃9c`>45Y%qmFv (\CD'~ C9n$(rQd\D^odxcJkGi \h>NR[`O&ιVnث[vkG RWJXVD LqVQʤp`JJ2N~AW6CnhAfQw9g "+OH~)˄T+ F`[}yY(]%JSsL3pUV e u^#ȸ@pbB,9U#fVRRcIrA"֤)LR8p f"`A+h k43S:#wO߯`<k6YYymKs[z`Y4+h# h4ZR!hnSV[u/5+uA+ X{*tjeAuJRX5JTTxɹXkD]#%<&*D$[W׎ [ؔ0ɲ+BZg軟)PrPN .&yݶW}p]^@'/a;\^$ӬT XO@MaZzZW w6.ͥ5A~NF3s7xr n<;aj׫"{?QsDJ5KEwe`ttf.bG~AFHҝO'WƮM2;l')=Q챏b}{\b/\D:E +"/|&┥y*S~`*玧h yd 8vs|Ӂ.Ft|=v>Io49]UxpKࠠTf5{Ukoq[ko [ M]{CQQH}b=f:ib8`+z!΃)\i<"v D^ECF]I^r7v_: z}YY7ǯL$ kȵznmزokjû`'ZӰ~K2un]pVgW@¹bK[ 2/ \1vpGvµ`$fvmvqGݵ]v͞f.VNU<9*he+YH6l~s^V+T(G!UF;g]9AtEpAAz~# Z9BzNӭ:d>9Ua/4I!m`r8jTpds_:n,msD oEEA7WQѦ1csug$JTU, R.(]-h%;:0IIs6:;1fד.-cr-xHP< !@,-9o~Qp{3~pGLe}Y_=DZj1W=kaktel&,v zx[lyNjp?l>2Y<9hȤGMgo:k>{돳G\ݱmx{m1(#RCy_w>ɜ tElADfZj@\*^d$E'TڭR9͐q?xpmjB5jnO"W3y B~vW JS< C&MM( 0K"4%dJSN\2Iʳe6'SPU4e#K ,ui(wJf8VI3)F9x~7J_Ap9n|$xv{㋨}ʦQ_>=F==;Ԝ(xV?<}~Vz`Qk 
[d?ޟl~xWTv:HG\gNg/ąnn |ܷ\?9 ?pu)O8VTS8@v%~P$F)q$BfKRF݊D*AV҂p;0ss8Z DZ,)S-Z" &pXܭqé(ƝZɠ-^Woۉ9j-5}6ad 4CH4ҔB!+薡) xYev&4JP }WUo0L#*Uo?E# .z~Y0K֡b?lxۚz!) S-۫aEuw/(;=k_qZjasFHiɳ4vȮ,Ty76܀jH`CBҧR #Q []TX_?.FjGM}w?!ARGv>?cf4׫*oǓ{3g5θqTZiKXe:-TJNt?iFUj.m&8`h:P#h:jr1,<(*psm%7±ɷGAmD"q6tN6p[&Y%f–OlrQ~U(ٻ*{uPw˱G)i˓+k!oh>yJtYk$YݻXR%, H' m3 N>,!$|-mՔ*uc <^~W,f' tns4v2!%񛼾o)Jget[VS~8\A)~~6pN%̃nq/78cmGipMϬ'BrF]SM'0ԝr{'`2P[YH!T2 G r/""S$;"MR fe:!\L}H,_ c0l&{(u a*-HB ` I-aΙ֚hB}r^TȐ}=1#֜f:{[x,x,|5-cd k;D%g&+CRB9DIᄣ"GѴ.J;.D{Yzs07S\Vwbbv~4ڭWVΉ\l))T!eSNT 'Jqp&\a1:z᥾(i᥾ F~l<Ւ7(e_VU"C*XWU+#{bXǍ.Jۃ``!m0"/wL:k TdI93eZL}QZL}ykV6&Rq/b"PEU]sy;c]ꖐCy]2^:ļHKB@"ȕ`%cJ(+M48EٮP\ÉZG$Ni_wxfVZa%HaiPrJ ( +U̒1m=iQ[ZMBEvHOgĀ +ơMNRRZ+%ؾ/MJj =7PZs?ڥ[|ڶ>؎we7ؼǏkAӁ-7_T,׏x ?yz 6 1Hy5<Lrۻ L )zU,WL3 -pemD0/PZ??SZqX Bc(}8/=E^^eK{@2ؽKvZ\:8CvcfI]|9.mi`9KC~0S-Vhx3*}v,^'v$ʹJk`A(\ȴ8 BDSؾ)@ ky)ڈ1`4SJ 9!qD,KVIJ ql8&'l1( k-M r qW]5+ƨ`%z~cTõT/%F'*ARGF1oY?,o&M^hىe' #;!\}G6M`$HNa3"Dl%D}C]#pẌ́$9g||`=b"L2ENwI~0iOxaj9faw%/I/ I$UN2##Г$YxICA{Xヤ  JT0S"+u 2Hgdܺ.F?//@JryYص iy<>Zk*0u2V FZiD2YdMV`;3OGPQ_{pFVݕnSkft< {>M ֬>G&B"x2$g"Qp'36=SlC ޫN|jc7K)1%L1eq7!^P*%3*)$ R1'qEL!)qc*$H{#,9ujZ4nߌaz-x_g1BokqO.\Z;^a5&J`< SN^=(30k4m aQL+򂎙hci tHE krTBa$Sz%bΩ0iL l*pdknI^^lv=KR@ٕ 䏫mYaVyC )Xx%DI.|uYـ:-hK,dS^nU X#0j7/Su3F`P}㈂<"0Dq&|oG =ӄvD˱#0ѽ34"0B[KgA#0nUk_=nRJ+^M4b3ӑRoi1LHd`gOIy;<)$e,5#h`{D#u`p|q퐛wOn W4k|ƍ<'yv)O g̜y"aiGf.mGfP#ŗ\$1](٥L/.c?: ݻ;MFdj$swc#0EN];֍'#Q+ בd $ĺ\E֓qTL@M¾r [6ñr.uv)L Pl!LIׁyÞMwtbͦ͑ 1Zc^:nx*:]X61|cQp8@7ᤦWz~VUik]r.LRɜ'RF<;,QyTX؀`VI fI>V3%Yd3"[4td|jPM.OUIZ~Z}GuCrlX7 ݇HU&Kkߩ5! g}: ql4*"}O£h"w`D q>1\rm4o+kxO=T"T{i`m)q{=rk⧝-WWlqJ)_[o[ʶ9ldž#5fFRAԐqC%OZ[VP~I$TdV +Y:t\cC.]K#F{e>ͽؘGyͰ%07vy\NqV;dhJKd @=qQax2]Qmfvǟ,+2\2?|$!P52CM `Œ-l3w;~H/n:*0Ϋ]5jDv#b䭈u!W? 0㧓AqK^GM@Ќvqڌ]O)ǩ GJL =hZ.FιݫϠ7#npjuté/6܊L4+$RAqЕvDBwplʕ}N ͵W'SpJ'CxH8PO ]矠o4ezX,|8yw1g`=sKIIY~:S;,%P? 
;>seVf4@ A3 *%T3(3J;D(mDqTctVpҠ(X0u\ɟ{:w:] մrx kލ`N'jO5ケvT':|$_ 5*Y?珿GR#ׯ!MTnjwW|ؤq3o+rᓽ.$`)@յ%`"z;jh~^cXP{ .'*=$\N0L+p0ksD*v:Nғsk $vUga%MTVvŕNouU۱:.r ~cZbX1[MgI@B a"kn;$F̉:TYT_Eq`M]Kg 1w֛c%m3NDV3=U+J~g )zvZ"-FY >_|rrv;f'#)ڜG$R[3WlP $Kx]ɴ*Oxn2hnCxGŋONR"1L֔R+6e3Ts.U={RЮOf)3~N/n1ԋ juPgޚdm{P`WH8-u{:hĘ&vLzĿE~p'.p(% iTHk?$[A>Q׼cXcZ6C>c}4M#mux58Ԕ7TcVAͯ}Rsn]b~lkBi+hˋm2_Lҍ.G7_m t M§|_1H#FX /9#tL2W v_oNNv=aiZ-7ܟ/EOѸ(wt8$!_Ȕ@lGڍH͇R1w4nL8mT(juQ׳̂ܚܽaȿYdKD5raS+v(*|p;Xki|P5);.S^8^f?5"MS(AD׮Ԃ/+0smD6&~0*Kjև] ŕSNV-ٷ6 &]~9\θ=E4[Qav9UX?׌7eGZ%P醌p34'GAcڵųo?$]p./Y ZTAQd3iqD rqwr !/c2eڌp"GXf,3h̕',(/;Ľك5MĚ(ẒaS &VWsH 3@H63$fqPHw;w9רS"R1QD刨9&LuXpc,zjK=ޙGK{; 9j$ZIiѓwQpzFyO|S $ȩ_ؚA cZ\CWRw䴝;_śh*m*Ckt`N ~C:QT8k41!!*3EV<%%D Xr"e9=59KBqn1ejW$ZX`CV+Yd]ɢR*p1T|~mgŘe1"Oqal-nJˮ գc=0 ߍJ$CjѶpIJ,Y'=R`ɹiJalUJ|A!#{v9F҃V RkNPLF23ʰA!Ú4WNU՛P 7׌[ g9'1t[ڰYC FYLJsWZ^Εb씱`T Xy ) l0Bt9R=dl$ʻ=p x=]o (?\{f&x ƿLxͧ71Ɇ|{n2-wc;lDb{$PۃOo |zD.f:z?^(GPrt1:􆃟ڗ,ÙK{{Gq/./ \ f.vbGg"s,58-7Fce)']Ɓ?M?JNX_+3Mna>l8mSL M)5N8.aId `x1QMQJ {N((Zl1bHõZ;xil!ZR/#5эj5#ZfwLC>;kRC q:yშc=neSɃs?p q?:Zg~lR%gl bΕoh䖿}\8pwQuES1ύa|c^ُ~2)6lRNJ2@b(2ɰp fP%rR;mJQE:C."H͆bj 됓&C #펝zHYzDXͯLizg3[%PdwT(Mbf`)s!Xw\P- O}tRZip$H ӾXm$-5~&P!j=u[Bl<*El)#͢$nm8@פ=[O[qh+uo}8xy]~]0/:$Fὗ&EyϳJ{?a;%g` a0gh/ !dø~p4gWk1L?5i({5L9YF+wƛ};(Y)KZqZ LҸ~#-jMtEur3bJ2t`g$KaVo>˞u7d䦯X57WD/Zk-fBhrFLpEi{͝ p6|GkK#)87)@Ct pu|SbH(RنI  uy` P$cỉJD0B,>Ë[2|/֪4$MBڊ'i_XD-:tRJsW]KOKS#q}Ie|2ѣֺ"P0g> TRI=XU,RnN"|{M\0<|,?|0uDg3W_,˔'\.ޝ3x¹n5Y 9 OV1Y%LWz.;gWR|[]_JEgRH/0C]3хaa =tHx-Þ:)µrnĆ)EDIl()b2C"fw ʂk\  $&-G$TQ8$NZH3*,*:io v %X{i$…T&Xp[)%EyY ^[i,:0, rZFjA@0c & Nz. xKײsz=)5elPϾc^: Jf?W~7x3ǟYPw*[A%@`:* vW6CΟVdUtjo"B2p?w%[uO$ Jߢo|0!t?&ϻnd|?Q"X Ϝ* 1o !PQQ)G٨mC!( y5K`/,zm0V:1D/~ey.OSsa!bI#wm~ "w0"8d dc}[INv"lRf7[d&[WbuD=jݼ?__wPhO^ =g=N%H޾sR-=crPo_ޜ|| 90'1Aޠ]mQC+H*)OxFs fn 􋾺ᚗg6.,t]cTgHN <ه߮] U1ӭ+lv5i+\ӡO'?\ޝ_\>3?OVoE7}y,OU`c4l}9 Wd7O . 
ecΗrWPCpN:(6DtX ŞL)JP.P C0:xv{(( ZZ)}8FpMk锎iݳw)"VlP b UᲷԹ݊ȯ]#h}58} tH7R.u:Eeu?Em퇦Κ?aBn5G@~E%|;݀+*[օED}M[dHG{ӳkq#tѨB‘ͿWiqNiYWn?!x^ v^dDۺNesߝJ'Hh{w*ctC8fGmφ5?DjeWp^cبi_'_fڲ5@hګ)UIྍ&hqj[jFsP JF-YfcN2'a!߸))ޭ݆bPb:CŻ.:mڻ Ϙ+ncX7nl GO`A Fv]sPlj6<7лa!߸nڔ)llBGTv\Ǡ7DDф-e ?- (>t6wK=&ɾۤ|߽KSU/=&O.I{F%O>Ѓӻ_ոJs|HmsmsqN3˷7N*Z|dUm}7O,S]򶉟_ou: [+xWMrmvF/qT4) tr-{+'pOvy-_W^.irlB"'KV|'M5x.&5Cf@W~yͅgkiA8F +=9[<*JvvUnͯ>ءV !]g:RB;YУ1ay@H n0vZ  FxjҦі-y~o)O Q;ӈʨcjڑ>_U{"VD0 vAڦuA^:Ja;qOn:8ĦY(\(eA,HQREUMtP`cD"#hk)j]fk \H[(@ZdE(QaА DC /0hkBMewiP8 Aj5 _a(,B?el-qr ^u9aåE+1F) QRi1zQᬓKfI4|FxuQLJ6xtG lbz޿ lbB3B3cni(TVV.OE!R 7^gqkzN(L2Vs t𨍒F4`Hdv6kNJiGZxWDf[#cVA@=Q `u-dFdrwBIJ#j bvS!Π OHc QM]dV?z}>EyEZr]ruwܤNryFheiۏI]j[w6{p1Y3?|wvX׻ 9[3Ec;/`|R,jCێ$pwh RtMh/mB/8CVn!4sMSKNc_kD`B4>xPI9d*(R= vWRkeOaJ%2>'նJ=[I[VeVURu$9VDg+=E+›%z社ږ4}[i|&0Bݱ7I91m{W7}WO-Vc.[ 3!51^EhoǀdXOí9lݒ0_}ŪfuܪCh I&:Y;F*y$O-))g/ɉHI{ cyR|kPW D)GL4SytՍE3n@8rOC"k=@i!T*׆n:als& @ ϜͶI/,Ĭ;̒p#I .( :4as׎0Z)S4:B0x0~m&ea"EP4i7?% Tpz6 h]45i0HI\m0F^D ga`#Q'*0P-"9:78I[ -g cZhCP54@TdV jmRH" a.&.|u߇tCe=u`q#F$k!P}#*}rr&9͜eL+Oyd H\^\Oһvw>yʉy&|߯> I4079'iOj>ƀ{^C.}Y20kQPz1 c 9YEsR_mK,KN9oƾE,?H*I, GgF'M1D q~ 5.<+)ܩ-s%E q,"_͋Pw . "䳀mP[%]h E mLQM͊&Bd27ToMd. O'(1]%:.{|OWhQ:'|{q̐Ɔc{of) 0)!ccPgX/Րuj:6; P oNx;G"t;؈KdZ0$<>4Ȗ5)g0&uځFk&ƌt}df`pfqNHN`^N-m -\bMMv9/D D3*SrQiM T|%XYP J~ YSn]MMV>{@X݆bPb:CŻ.V:e9w3]ۻ+ږIc6UZ#W޷doM__C-ۻC!oR}:RJP*.BVbߌ*saf heGLpA0 $[9ECG/DwIϹ *Kՙum/kd}]yneN!2~ aI>i Yܹ*\> ߚC'#B2c-SL(P0uy렸 n|Jl5Ce\/**yS4 LMo:#+Y7tLXaډ߇Zldr1{/ Y$Rb%Ma&. P&& @mk}NFHGZv됒"46C `!F:o*tk!0FhSd-lu >MSڬ7 A! 
?l~AmI˭u3eU\|(YYnEr, @^-RK@7ZN}yЦ)UEc@r+vȋw˥_,wA ∗Pf&Pݓ>wX#X(0.$Haξu+O|m\_$ֲxe 3KCva4{c/NZRRjb$2XWVcwRȻVgwfj!yN W:HSB{kB푘\!4zQ!&BIVB(B)-iB ,0RAnd!Q) λ9MS1E0 V!$ϿO>26H6X6e"Bh"z~ --IEJQPTRQJ 6VѴsuss4?jT{qw4RYWB7\jO\z\ty%; .e:KkW'r)cb\hE$A[}࢒dee@c1`&rZ:M L"1`]珓l%MXB -wE´7\Z؎KAq) 4jA8@LLRF˹{+nS\RQA;ԸeJ(/x_j TAJ7}%~,QhLnu9}jlsXҼ:igBnLdSNTDŽ}@-;'aaxb{nKzVFK8i[ZKnj` QnSQƳ&(UEcf\5nzSޘ KAN *Q>=GOFr~!D-8DBA"`z 2x'Sێ',yWrR#y0@44IuEh#`7''mH J|$.7 Fcv:i !&L 2 hB,5sDWVI-J9RZ<&dP%/ytE3@8 pA[7gVd++ěPZDnib%TFNP!M!Ej޸MR̳&BQ\MFDfC#<`*嬶CL[C .+yUr$Ec(}~P9s;DB1 !R!D%UNI!˲Fs*_j&,QS GT%CoYͧmUqW}, gǒOl̴*~FO}\炍)!h,?/??zR/ݣ?cLpqF aE3 ?_~ ed ݇MA o>^|1T( gr2a!^YOAeZ_11q(qvwkōm^O+NQ/RpÔfg?iu.y%z7-2B2Ē^f\j:L-^pdLh`| W n l*=rךSRs0~ 7L2*=VLWѢDBR%Q!蛸b;lf#6^: fC]MhԆF ӻThӊ6JFC5f4:4 mS699^8D7jFg6Ɠ[O?=z?"^Ey }ARn&&fWa*(I%]+*ɩTN8-秚 y}s+ /:ńdj1V.m29^Kk /S"A׳Q>Ogݵ/o_^&n@1f=a2y "k,ej$(E]D2j$'vƊύA -u]}^,hws1/ .K<L.P*Ofy}Ul0^{:b7=~|\+ ܡ=2Zx]sA5A7w#d?Z*~I=ܫPƛƼFCy9TZY]8Mq-+˶y& tsQzgtyD>+!2}+LQ`ཤ$LklFn푘m[h\4rKz8^eQ zXre!pL~ w+3 tRߍ:V"q+bGsˮBM/"49-FnFn Bύ+'+lXRVȍC&x\sR4)ZRA Wa% `єDYs5FuSg]чGniobEcU7v} \fS]|6O߆k>K8|EZaE~F˷ʷ;|OE^fEwVr jn$D(q[O,ijjHzx[կ,vv"Z4y<^hf0z AePTrD党2`Hq!-!NI5ai_ns2[q-܃ۂ 6>&xBo(ݺ u#GB n;\$J>8O΢x,ΗMɃݺ u#GыX1tG+Q!rmSerd\ôXk)+l5L!+J%;!V:F $ꮊx ß4NK酐ƖY ✙sc5f>۫碼X%}s7qbkw]8~7gzyR2qQq,;O]O11pK\r7hZ=V8_=|a6 >u}(rȯy\"`s3 7F-.IӤL ]sbbvPqo/)~tZXfA^:f랙 {jBt9%j='׷d)g[O fK+RIHI}pPlyǛ=Q6=: f`Rt<Ѩ8E t<Ѧmb14=r;J 9";.jl O4T"(eTʐҺ #t (-ZZ2;q)$%}= z"pE)R[nmĻ3JJD`6(ŗA*lK_CF'+f`wF:D$vR{~@-0}+\p9Xn.WP X ɕbiqQ!pj%^*A(@E1(>7&_rCbo=<T;֙P75jFfHzµΆjF`plF&='TQ]0*eRJ[P!=QSURVؘ y3mDլzjvw3(G9Rn2mH9ԦvWyZ4E端=l@hS|73{9k*"(.47EDA)PUHS*\'DAV5[7%!;`)$Dm_FSBmyWB.pjH(Dz3Y,@W#Ka`G9THwDoAM&\B2~p:t`qJIWD!:*qjLA%ĐKA<낪tn&W1`35wn<#룳⳼- n&Y[[|q_]~X|m:{[8tW"_E6jqs (ʯXx&x ⍩,%>+=HjK.-3 (pQ_ᒱ2ֈ+1҅E"ŷfW=kV:0<ӿved3.OZܽuCeTE֗GR9J]9\mh`2")+Z:QɸP4`^$ьI.4'LH,r68Z ➈J#;Q7 Ne޹eޙHy*nGR/-oY(mWi#sc"u4:j(8h昶 V >(֏ϥEmJU `ڣ%õv]h"֮bQ]*}). }gy鑛cmߪ_-c)%e%~4o\F~A57-?>\QBeAXtƿ!gߝPAojAzvئDbo>^|#dųm9TC! 
8r=e$!ZqcCJHM7'5@p?.y%z7.ʹFMuAb`)BK#g-(#iӃaLAb 8r'׃Mb0Ln'G ј3fZ N14~iad쬛d4fh5fF4 xCPyү Gþy84m{,{" j 9˥ZQ;U8!Uc MĦךz7`,V($wd!-v$uhuh#XWn%6eΘ1x[ RL9m,Pѫnm MtM%tpYN Af=T8 3udApjlJ 7q0J"{[iYƄO81M%Vmd 9f3YL.c2 Wh#9_A,ihehoL~(9Zo/aSҔv7;lD"HBMۺVEcq,R4ui- b=)ӎM ZnKt{T^8Z[l;,B11`ARW ^R)Wp26OF`z;I\Y $jdv%z6X7!ƣP3FѢ˜5R{JB{mF1l* ]Լn[cJ!}+:k^JZIRpt{'Y2wE5#Y yTc}h$hZԭ:PiDء$ a6σ mII^Rԛm<󘥙Gs3qFúVJrpD%lW90lhJd#ΗXeQ>o(LðͿmH}W_~َgde"r%@)}&xD.e$QDLL9߲k} Fkwj!DضE2R_=3s+$'vV݁sh5@v'y=͈ $@Բ>e ҏs:TcQWRTIÝs;))v̔\>mEeQbiLSph_rںFgG+0Q`9n-QM0z\*DSm$DR1W}u'D0ψko_7tVHIK0 B|9Hr6 F0`N$W¢!^*,j}R%F*F!bByĴ"j4):?#0s+)|qMF+"_': l-kA$7^VQcW-;190 )*o\O2ZaL܌)TƎ|`ʸ>Deܜܰd}($-bF!2. T$:Ȓ4s*FJ#etNJ@ebkP@c2ro ݆`16$:1idPi7ݘ0SHBj7 '`+"1D&2z[ہxvoc|1K5"jDw#z `"(LhgY EN7Ncr\<敶Qm/e +O[1qjm'_m]j-G+[Xiq\,YdA3* 4OtbCàIrK1|ր%S8W ,g)O;O䉀nnf<z"/Sz hKZ]yM#1KNfKiv1r8صy8ՆBu]ZI2DWHc2HAo*}GrC)D1,+7"bл[ RL9mi/ڻW%z1,+7"2#eojقޭT)Sd%ʻTMD2+AĘv|;NTp!-tGbEb2w$Ɖ[Lڕ1,iҖB+TT{ J'E :$ֹYqoH4AbhkiwoEs25l4wf Ve:2.n\&ྻ'|?n$"gW\%yK:kgBy]|ͺ}z}\>kmǧ]yn;OxCgؾd{ܐ;8.˿~Ʒ4بԥx[}uJzpV[ d]BCSKNRTw}(J| sT?SݭƬT"%m*&8zd^aZd3*vwMc["pԽIXðP̘e0@.,QIoJR*9 ?D%dPpD۫QKqR^R)0o-Ʋ_` KxX^%)`hI_-:;@]'ڢ 4StE(1rk6"]!LÃ=!~#}?'_mwm1c{~gkR4.$+4;gh1y՜%`sF+a5̪1ZUA!ǧi!Z(*l w+i$k]xIhotRMzČțөחԵyC,;x Zgm7dVUH-Xk]}g5]RrTKKx94l\Ok/J b'HηڞׁJ<>/-e`ޗ0h-{9mݐdw߷ܱBTHmw=HGม5p=9*7_ ~"㋚5I`3^ʘ F>h0Ai Tqn;J:7@\Mo *Qjձ晇>^GZ l=-#OKDx:/px"FNs^| V8Oj:Z H 6B O3}4ZB9ۼƜ(D^IŪ7mH~8>#eX)[--S[KWkU6U@$%yO8~e KfbX:+ZU0%i"%EW b NZ=Iz8nCX'p5Šug.̴]' GbBp1w+i+CgNR^99fenm6JJ$8&?֠QZ6K1`:ФVmSwq?vh &R 㪪9Ҭ7.$7_PIC<` dy@:HDzi )^ Rj[[ "[km5e=F( yPǧDkot O@S@i"NSgt<<,ktAr Q0VntZ2c4N#O% q)hZ)|6'9f7Qۧ./Nzue?|}X}w{dD2usZ8Ix:˫t-E\\A *V֐+EmLpז;`adŬc#-3pmܷɉ&,io,ksIgD8Q)\ }mt9aH%l8n}Yt]yuc'v jZ(mTO_M֕[kz˗F6Cb|;߿5Q$ܮ_/W6Oo&h6şnnW[|C4ǾDZ}ůo KhD?՟ FK-c5;>ƅǣ> (7U1=6(q&|@bTMd)պ @덈h7q}ٞ_ln +TOu7ͦ%n^՟7wܦ)u+%nJy?o./ۋ5JN}v5 [}a_4"ׂܝ%F{kYOůZ\!/\Etʈ{Cfu+Aꔽ#ϺM"=u+DKnUDx8du3n)#m'%'/p+Qnj9׮@":#l AԖ X$o%huЗSTZ2bemFSe?H(}Dt¡H S, Ҕ뤏)hA6z(zZX{}Kcd%z7V7?z 6 cX;jqkn]~^_o} ᆔ hoS[rI+;Z| LVTZxǓ[nS^ɸj~2$#`Lӫl:0fቓhO l V(!:) 
@R8ZgK#dHfXk3e-HT͞|K%%7{%9C)k>>f0T4cA,٦Bk9GN@ sY}=D&RheDcSmsK]ow#O ;6Oo^L% TeI! F],9u0Y8\)B./7rr%۞Ϯdr,s84*j@׋U> -26JJ_@G) Nlvǩ~w@jQzZ>'G  [T&R@uP8)#ŁK&ϛ$:mu2MKSBKL5ղ^4[^\cKRK!4 | 0k E^֨0QY10(c"E48p؇`05PPWpW~ =Csަ|q&;@cE- xZhqF NP<ηMEO=I(GB&Hȓ}LTI9I>Xk"]*m?ߗcȸ\Y%Mr6/R5CNF~q3h'_6ߜIK @tmOyM⦯3pR]7f@[92K$Kdtǣ%q":r: ]nmIG`Ʉ2H_ucޣ*7bbX\3c"]&l|s'`(0iD4ox}kovV0>L&{aQǻէOgO]0~!(pӯ=\=Mx>eSip s<$U㉪ŒqN\F7. ,)m$B&+HȑC:R?AZe:AY ,%gL r! ߑ8:ݾ+ʴUl18hq2`*hUHfѶ3D1al^B dFx`J$QV酖-.biz8eip>sK&3}%t}:=CH3.<uU|*qk ";KDuꤝF4'Rr_]m]miz.< uyŹomxt(߯V. d?~l:c5~X՟.vt/qל6d->E}y[dԭS lOR4kYvV§*|I()<4YX*mMtw}Ҫo;lj}O,zH< < od 34ZӊS2&ψO9&jݣ +p?CbTfb`5E;O,OuL'P!c|XnI":Ţ& dCsp^4䅫hNi>nD`Jy:ȱnLU+`hVhАE:ewMA+J B$d# m(Ba zp |T 4䅫ND˦_c=RFGr9Ƽ* h`_l_m0eloW{zݳk_23Fʬ"%NLu*.d|g o7;qtr7{>L!;^ZfˎLTi8yJ݇S 0 uAw$5J-mc@ƔIŠ![/gWyƾ52``^ßB-%J&1JrV=cxطf sbf"kHRY>1 D),(/C̀:BD`ڔN5֚@Åahs !I\YJT29u֪/͆3ʆW{z7)K{Iåʪ67>ݯ{ߩg}?Aݻnosnj(*-l%Y b1z;Xp3@ dj}v8Wg<6 < WE|m}5>hR|k=AGTt,(7FD~yBH@Z*eR 8xvHl`8hڠfܢR2B  F[i CAM X6*/TT s\JQz[GQ+ KQ{md*$}<3r..sܜs]?jǿp 7}YjpӘlNBl\@jݝ3؜-' do)P\l5CZƹl EѴBGۧCwo[ĤB1)I7l#o-c`8hL!c ’CXleGL*\*Vd3RQ&\(MJVR7d/!'AF퍎AxN(非w6;=0ƊJ!_O+$hJ+|j+~^nO*^g(OA (!#^J 9X}^&3N ŠuH)PgGJ .YStl{Cs.ނ.x^=kbJv *#qb{^2~6ԙ( JW̼*Q8]πu|Caøt0+IU6&ѡe[;cLKB͆=^`cw&OũY븳5K De]LDuޒp?<wObLȯ()i]I9BY?JpBY+ j8:W Fٌf03O_v:3Ab'X1!0M8p @jў3vد1)~߹6'k5FcRkp38S*#n.D2f/u>o\OX0}G OrCq|S6E|!:&Y,)%V8%35mb&Tw.9wv|Vݦo^'-q޹l~&i^n{*qIr\"cmS:K3:em]-GRt*n}eK.PN%՞&JT,:Z',0  -*ikG4z.ڽ!QOdgi[ǿ|+oG 7P!4ku˖AsqT>Z%[S>Ot0 |S7|>۠ruAmY| Cmz$&KAs2 "hM){f Jx)&%m +GQl@ Y֌s,>.<=L>$,gIV*fn" ى>j(..3FGXh٧ʅhCJl8ݲ~nr.G!CÍX ==ؤRN) Cu5C0dR.JB EŒ3yRcD+9Z11R+NUl@~RIg<%*8sxJǑkŔ|N)a mml_>?.ǂʔ[O9rxqxYuZrx&c/6釪3ګzs8gZL7MGLh@XhAM2y٥0eB^i052]ن)/Lq^꠨_UFV R9rnZ4bЎbsܘM"k-hRZB +*AF3ezαm7%:r9 ]kpC5g?Ј D.,W4͆hY`-K&7(:˒#ZѲ{zh)K{DKå Z-^-2*FѪT$/ٖ-WQ3 '4 y-+ٱw(ݾ1Л"kz[CZ;-ژd̲yoQdwV(J [\&̟TJFǑ,4p(zO UH7fil.zDz3܈Л{MYjCo.ĘzST Cof,5Л*~Moaƙr2" yVhF)>NU**A>4?gK9v)ui+8|Y꛾ԜK^)R)1Nsݲ&1尡W&-rZ-92@=2ynLcr*keQQ)6JW7}{5J/1Jʢ!;#J_/5W((T @H$3Ȇ1f JtdAyTZ`62ԅN)1T[/I}ӗZrv[$K?`9P ё62"w8ը6#xM[dW)ǚ1)6s3ޘj99yD ")v:C%r 
ow^̡l F W}w[.=|)}/c,T+YZy (Sbes\;[fN!ܿV4Ow߃'Z.,5hh `eܧB锹:%Rg-F-|E~*k\to]F.&۟ʫ4aM<}[pFh~ kV+\}\SXoZj?돉v~~ٹk{?\狿wH Y}Ip{7A^_v1Ks=&E-nɒnOLlŪb|0O>߯D;]kfuu#8w)Gca0o'GUemST99"7л'_FD8% S⛼} MÇo_C_"ii9NPr}ٶlۯl'z8کSB6͸_^hD&_z.X*ʋiͮLVTsp U9/b/_Vp49[ۑK8x8Gwx&ojW \¤6Sz1[D>8yRMui~]Z#xޥ5w5Mn~}7 kdQ!Q*('98.8+)=}%o7?&?.'o r>}_ٛ9劓^@R9?; )1RX)&<3UZHÂU1yPZG>jT=r# mN^Ck ~:|}*Ej[Ou7?&?Ta^E f׵6O]N@5rmn`V7o냺볲o>+SC}kX<.Igi:7Ig+Ccꏸubܚ6Nhud=vgvψzԉ&L#n 1S 02i'`xks ڋ6h~zY6@^u=:xұB_isdDGηNe. z]61&5S(ղjYS Ϋ׋'_O.8\wo3JsA0XN!ä~ECU>=K˱qH]} (o>7z|s1je-r)Ђd+Ī_| }ԡT^nCn AŦ/ `:lW* >: ]?=_-,6ҠWn5wwKM٩%P;; { RIx RKǃVDP-#H9*D+d)Au1uG Vcb%䋴'э8V)}G6bm;Cˌn}x}JcE@[ɫ33%3,ykϝaiDŎ?lii1ծHZ6ARfLI+w%єKej(8==82J&3Ƽ ,JR`7ר@͉lw ,h*bI˪ZI.Y:gSɘO ּ"$~~~@aןfR%;Lj~:[cm~Z~Z-Gb$y1"VnJx:SE,tdʌГYZjJy:g!I [j ~-~¸0u }B!˜|bFs i/gqb'XKTj8v$nV/FOv-,j5q3TcTGSN,B2 7Ώ$~2ގG;5>C+-)ұ67ˋCT$9;7)27YubH-J&μER3\^52jay@8d$ +c1q!jWjd;Fd*AQ;ڰo7-Dr䃱o=[}x%>%jrTn2XyW)0 `)d5[Cp-)x4=V)}G6bAE[ѭ]tǧ2a'Tb̞yk1{pSd(3Bo#ߋxSnI&(*JiI̔yyLcrڨ%ߕ:󸏨)%O+2R@x+/!klfD:mϚ{}kZMb>G'O)Xy9si{YyivUTz=Ov>qNrO܅&! $X&y|ID*{DɌ`a .dcJ<2Jҽ:&%$uyƤ䛄[YD>5/"o)̧u]~gl+׍lj337_ ̀U7[!j0$b%Iba.]vTP"8j1c&h*e82ƋaMIw <>t{(Ky^hUJ;S@NFf=۫m:FP->&عیݢ+Lfe|*W&ᎊvWbQӛ#dyYùGEbrTEu[Ct9T;-2#bM$!/=nD2n1D\=8&TㄿI{W9Ox: O0Io_-&R4NGerFV#xU*:ɪr`֙i yQN {D2fY_ Ɖ1ˣ67R _ˣ.XcDГRF򼔭QzY{)\jl}? 
/噥{=^@[u2VYQ흪JPc+ۓ&FU40pˍ^Tf"sE2@q1MɲBFGq\ A%ॷBW#Ė+#F\<꩚Bmm:̵V}`kН+QƐ.k 3z@8>B:V~ƃ!#lw;3Hx>Ec*HMEF uA:!^mF##^m61Y`'NoM)|{k̺^JGTr#L DZD61DmRʢv35ƨj^ROT b W] 롪zkDC( i,E*;8+Ds׏xL(f݁XߕX7%}a͛KOUt8Kw%fc_Oo/yq}z8K+j)ޱ/6Q 0R"aؗu4/H\/t7OJ#b8!e5irhHp`b_%,i,>iZ^/oߵJQݸݮEׁm6!ynRS֠(\huU< dL]yh| 7*ЊMnWS 6Ly8&M/;7]}=K!Izf7egsn5&Qx`.+z-רZ‘2m Cƫ؁7Hk|ҽ&(<6ik\jcx*3̸ E:͔ΘD1V;;T3 P3BA;nUVj", 4ho@0vNZ=R{Nr&9IvG8BÔ`e(A"0(*bc) qEF*,e85>QTqʄcFm:Z^S862 =GPG+33 8Qbmfg\D1C kUp­HRV"1!q́\P,MlKԨhNJ[uv.ouKϿ@Z']Uǿo EQBJ_ѫo?'ToUOY׏ TNMWL~雋T"߯nnoN yhx㛯b2g3d67fniO$=1ZPm6: x.xҞPJibYoWF%)a'd0aX[8s4X_xtl edmǒKd$ uNբJ.Vy.oY\։5!hx}0oHfs]a}>GlevOo} wnxXV?>xqqx<߾l*=ށ]Od1N<5"=/]RC3NұK@ptSns{-oSDdlj&yoY}N5ӓjqY coZM)N- y*GPJZsTh#1贚CvoxcrWDVLxǕc=.1c$ ,0e$_,}WyEZSq!rDj-gRWFNpJȶ<(ڝʲgkrA$AR\b+S5ztƍzg%i$-FQd5:KҪF``ɬ8]V>&mVU1RCHW+@Zc|)[7q^R^iǗի~0:*)KD?rw\`v؃iDDL Bv_(:ɝH\ S& pW"&3𿙕 ^ZXlŤa *_ pUҥí~mwUQ߮2q#1:DFt>DOl8^:UHEr[b)?x?su+ !LCp]ܳO1Q7z^'()(MᐆR Z)R1OثӐgD Qo>)y ^벇B' r{Od6T,Gg;,4mᘒ4ӏ͖,4G`J((aUWvCN=~Ƨ=F Bs`TTo5 `7NHz5a8 қPJB,S(R֓6w# ɍjF`ĩD>ۮ}tbɁwH(cNHO"xq<WܼjT'C]uRyUA9f= r"zyNUF&hV*1K--_!Pivk_?/'nr}o,BoElwEJ0" px|i N[;1(rɶ # ǧ(. 
FQc;RQ˲22g`^UFb9v!5*+1VR*@L%vc6vGDR-4AH>6"Á8'X!:GJέ%ج*kΉUeMC57+Y{&U1XN ,gRFEV.W:IieGHfƚ|1g(I/TqNN(N$$".9/"jᎻْ刄ɧ,4!j$$ T gq^zZuS,@Nc 1O8Pܐ 譸OF.xRv; 2t!?%0Rܢ@W͚푰9oR3%XiI2x$x,DAE>#*WsN:9(AܖLQr' Br+ D$'Մj\<62Y#Sy0!<;> QyKgx񿰟7+Q,9*Qb^+Q9 *S(UDsX_!sH 0Wr fr"ebGGBHIzc-!܎cULƼVX[RJc+…D& 2Ut PwCYqCº)FRn&2FQcd/I*5G\Y(fU ;g +2SI8}!3#~G)bI`0^}{ ޶?N0K;8Ksvn鹜#a~{vN[tsѼJzZseﱴ{<( t%UwycaRį7C{@Hݕ.S4[O4V ѓe\9oR!j5.b݈%C)r[E1;> f̰m;|3Mb\q*uq8 u$sC7^ t}̅ ^?#zN }=8RiC LǃK@xʜ!mT8c)FD)xCKZrPo8f5T 81FNH?rl;ESlbEB!^" e/*쵫v( +M%DIyqҟH%FC9#{Հ.)cR:@U!cqd%!i8%jB\rWz]jb)SGg+EQ5PS"؋r YiIhI귧2?'J PH99 A ڮx1ܚއ)??,gL.VOҨlIWK5YOOreXb`[/J0-_=>: )f_g|u5|bȻǛ}T&$_A 2{* /&7zf^6Xko!oO®/?IY;7V[ξ#ȀS2+ >1~ V-x+0pY8 - `C8\Sg󺽍ѡȁN(p7x# /XlL 0gGvcbՏo7A?ކ?BMjA>?[}A>|ߔ X7 4*aU1, 5X(8C\U*a3)s>\Dz+z]NVA0 y;/f6;#O&!& viMΊigKŭ@IG:Diɩ׋v45:eZ% (1[GX*a1 (èyv^/evV69w"P ExTQwfYn'pE ]\J SűE K0(JH㨒 yy͒n HHℙ0N-$"2$z3]rqd&8_7e.%<`>X5#/@^ E<Y$X):_C*0+ T#L:+n W*o:ȕ [SYs#Z:2PRޖS͋?Dow2?/lrJT֥ ӹ,ЊzڟZ8-eInpg'wUÞٯV 7[ ?&7 &BІ;ynv&Zbol !D?p"mt8tFz^FK-Do˿)*_eQvEB*Jb+'Ad8(T[TIR/#OARg ς`a eVioCs!iiS"&"P\KfTQ(HQ`hKRIS6IBީ6j1hʕ\3n$V^&〶O($ur@Ra2IqCuđ|MqM,;Cp "7u E÷Κ` %AI3𻕜Ga"ಙ>Za"Ъy=xa! 
宴P;S 5^ FJۂJxI!2UE8^+j61k蕭i&(QGx$e}Of{M+ބz /[/2Wwzp#dIժɿ{z|oֿj]o_&nL f˱`ޕ5+SR7F7U)UJnަVYjR^F> J)ER%zoMv}Mכs̓9)Dl1[e暫EWd*PBqf*.t 27E<96ΖȍfR\HɧZUrMb[_(ꪾ)9oˑID;afE&GBf@?ZSD\ƒIn$8NzᆰgFyl2RG9yTeY^o-␸yԯl=rY .6F `;PhVI_2uG׻knzHzX}DnZ娍(TdllaZ]7 K02_=:-}$왦|s߾zn/B-ɭrq6M<ŪQo0zq$n@|NN~YEk->ln@nF^-*u-Jw|E@ר~ % -DUepj\n!Uc E8tw\&ҋ=x|SyߴcG-iz~iۑ`&0[l2f7}Ck%"F)͸jm6dcs`2r&b~$x`C5PKNj1103mFDok "xjT9jFE@snr&r ̦%6!~17_Άpz3{{z>9U~~o0aH?^azvxy:>oAVlߴyw`:;_[n ??>8;ϫs DrY+EyNH?_NNYز ɒƙ_p`x&ﲇi-gy]\hvS}Mu uno:6' wֺ6VQ ~JA3% qTVs΍!!\]H,Gxsvf{u~q`›Tu tjjna/,LupjMk.Z0ui%87OO/O8gC@Qb:RVWgzQr͵}:<X-d,*`U:I(ZォO,2]q3H`v /29j{8(ؚ" .JE;UbtWw#ỉfWOyzpo؈rȵٷ!I\+M\`˶~b[-Xhw{;Ι0~xb@Y}*՜GhKaΌߥ8<DfиC:സ?X|`Dv99ܶ(zowOeb3cnA]\Ї&H#$Jl+ 1_ ߏ%=vClZB5#.٢ 6{n XԈu!$t|?ט@* :*8/6O\z8cVKxew[=gG1~hH>4TX&=Eb"PCK"C8wƔΏ?|ev< 5Yѵs$V8$[ޚ$;}u+<Z>_ מbE$[6] ~+3cJzÉ$I:iZ% TA"?vfow#$@hGwzݼ{ &0R?Fˆ4EXek{ _(3J({ӓOgIq }hah ma kp]s;\ (-Ut>ЛT`a}*8dM¼tɼ~8]^5~n,:?q[}4'Z{Gzv-w}W_-.+=+,2FO |7zw?~y;ɵI̫q!RtZudV1ǯUŊ7" "]oG;۝MqrT1>cC5כ}[mv5 @XgSh<&pnz?ڜ̞Qߝ(S Jd$gUVZ~jMEAuw;芚 ͊|CmlsHx0YF ;ƛ[G!٘ 6;pcdrL\Akl= 9s*B>[jVK1&ǒ*b# MĬ8qsKBmDTc=T]qsDpCK|֑\-Pmw/KFEy[p$qB'q-Et =ڴ'p|i$I1oZÉ8zZ廯U3WӪFy촪eleZz 7F`uN2mnpyc<^ji33m-h36*(?_R5iUw-؍\pV-O*L'H]PB{yj9~ɮV5 KU~ܬ۟y?jp7 G$hh+E!Nϯ =!,"3*$i`[YU,|Q$7PK,q Ir"JNo~h>R5ZFoamîrA+ |:􇃎ʦ %@R1Ũn\=j)jqe?< ;jn l; C6Msf]F e@]n?fc ڸB)+F4l֨[ajէ;R!zR㼛 0/g>e4jnʣߢMȽY RJi|&,DoRʥTaRQܪkYHk#KeDSrn]?GHELDQuًFVмБxy۲fp_Ƨ9Aͦ'iG.*t^U-) ) ܈m--mڧYTԾZ;4l 6Lf/Mqysd]0 ]05Gl]g|h-TeF2#*l+/dˡez* Mx}2hk^:Ch{랈p<*kI4?%$x'`ʿր>Kh2n ko32 /;U3&\P,O0ؔS$*T)f$[a$th w?nt9-Pav.\MrEK P>%Drv! 
[gt@_ ,B1ZymIӨU%mjQgѼz~YVu/yy-)lmnF/W{{3F7lm.8ɧM0Do!8U54H ÙE%z~hztr/1U Y s<2ؑh|ΚY`7Xe9K"%3dvS]$  T4<6T͢5=U"m;N4 zf$,<pe/?ߜpDŽE^e`!8o2(J ]m+ $Hcam; P(4"ױs V71nEuP\:p`&rkp(KI{e.c,\V)QB[PRi顸ԟ͉$AJdų*x!$P3@D L;<6K}6c 4+ n^~ՀҕP1YT[>KTWPUБP {ki>'w.s Յc1 s 0VA#r+5_zg4&asՕW- rV{ijD:\ *E^Gfo^ mo7EeLiBp̢OWյ7rW5zKhZkɺ yZcܒ..ĻR0ܽ:~i/"ugJ,*<8_0I&LTn;/nnDk^Z1zf$5~ 4fjDwW{p=y{feZ-.QBPf^,iA eVkWVО,+FPGe?zb-imf9&Mȉr=30GHn^l2^ϼfm݇csLQqiUfVB+Z"9YRqJ5|WM5ҙ =UZr+ zr*ޔ{(Z/ q;)r+lNuCNܶ:͌jnm!Ltr7=G"՜5"-G-GzR-M":KyujHTx J@ 8TUy)Ec>/L  jXO:[+GCǦ*FNS{@b>`onO3hrIToN[>+[)VUC#7^]^cTFOzk>QmB}BFnSsOiބȺpwU7vw3ח"vI?z$=TѲQ>jg,}kZ ~LJe'W4cyցphYaҞ?y}9Mg4=B5MoXʰ؀)oG[r[dT4 Ɍ*ywMz_g)n|Fd7V~x`NnNr=TϾiTkH0,Y{6dB&lF$V}C1L[-H$r)mvN.iif)m?PΝ%W]4cоYCHomĒXj0ɹXDQwG|2xKǜ-[UV1/]%VFU 7W"h$ϧu+o{g8AĜG[-XZ+C.u~(m-mRFοnrg ooa<{g[.9≦qwGZccM%h6-=D_/*lRu&芖~䣛GzLG]sOfKl QѪ4ft=9bIz3F:*QR44P P \OF kx#O璋fG὆C09sNUⲠ0ЯA=SB;/cUJ+ycb\ 87l:3ڨ@~Fw5Jd̥R41kd%7qqd cD)Z.,(ס*hI9XY$ XumYuKߪqozG3Eq |?HE'ǡ/c:ͻ#θʘȚegȨ'ӹD>^\_.N\E4Zj=|WɥiWjG 71yf<%ѮؓtAih9^]Xe$ yJJ.@cinϾMe2ؖ^tu:#DJHwM5CJ~m#o%GK!N^w1=ַRI^!&&.6N}ʲG֛@)ڤ6́F/m[ ؼV6[ wf@X8XvZe аf-Va_jk]j-ٖdK-E.ؚ9 նpG)}-J[j&xjiySŘHȏ5fQy|t|őNex_^RW&JW uv1pjAw<9x/ä|tgh@\v6f8vׂ|9uڗJ-FOô!Y' y*IyuSJuAt}GEw4 dn&[UNi-3`ݺb:]ƺmqHf7fݺqIn}h+WJ1E( "WYsRFXQK$f̋i-v"aY$KxHXeX4O\3nKy G5dT1+ Jz˳kYREc|%Hi__l[ Ua}h+hLS?a 4?~&"IȽX+v@Ysޙo#fK~Z /ORf%-ĢS\I`fO&C-bĒ5J`mzI{Շ0 KD踰-,tO&m!Yx*l$m[$] )~ 6],2\ .بajt_EogQ&/s5*2 eXeVXSdEiLaӴg+qF}K^,lԷV=($wSV8v$,\syaRZh$ޝڶ:iۯP7ZJWiu`zXSajF(kFf.mZs)əVU! +4*QHj( VU(yyHt@ y=t%R3vPi>)#XHDP^T @+teiU gXy<^_ 9FYLH/3ep{RCcl $;$xab"V7*,mؖuf|-kbmY:"FJy(ɴgYau)Y6鵰$c V4k.APVHuА4fS%QW*SzX9^UQ^6bX蜴X}Sˡ\ iKm!j-Ef%اP-ˑqN8arsը]:<. 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 11 11:59:08 crc kubenswrapper[4816]: body:
Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,LastTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 11 11:59:08 crc kubenswrapper[4816]: >
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.848167 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d74c202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,LastTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.856610 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc7940c4d57c1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7940c4d57c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:36.975904705 +0000 UTC m=+3.567168672,LastTimestamp:2026-03-11 11:58:48.230863738 +0000 UTC m=+14.822127705,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.861187 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189bc796ade212da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 11 11:59:08 crc kubenswrapper[4816]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 11 11:59:08 crc kubenswrapper[4816]: 
Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:48.276718298 +0000 UTC m=+14.867982265,LastTimestamp:2026-03-11 11:58:48.276718298 +0000 UTC m=+14.867982265,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 11 11:59:08 crc kubenswrapper[4816]: >
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.865566 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc796ade29af3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:48.276753139 +0000 UTC m=+14.868017106,LastTimestamp:2026-03-11 11:58:48.276753139 +0000 UTC m=+14.868017106,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.869898 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc796ade212da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-apiserver-crc.189bc796ade212da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 11 11:59:08 crc kubenswrapper[4816]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 11 11:59:08 crc kubenswrapper[4816]: 
Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:48.276718298 +0000 UTC m=+14.867982265,LastTimestamp:2026-03-11 11:58:48.281131904 +0000 UTC m=+14.872395871,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 11 11:59:08 crc kubenswrapper[4816]: >
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.873768 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc796ade29af3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc796ade29af3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:48.276753139 +0000 UTC m=+14.868017106,LastTimestamp:2026-03-11 11:58:48.281179745 +0000 UTC m=+14.872443712,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.877637 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc79415c8fb26\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc79415c8fb26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.135002406 +0000 UTC m=+3.726266393,LastTimestamp:2026-03-11 11:58:48.448588537 +0000 UTC m=+15.039852504,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.882052 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bc7941669747d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bc7941669747d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:37.145519229 +0000 UTC m=+3.736783196,LastTimestamp:2026-03-11 11:58:48.460456216 +0000 UTC m=+15.051720183,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.886463 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d734768\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d734768 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 11 11:59:08 crc kubenswrapper[4816]: body:
Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,LastTimestamp:2026-03-11 11:58:49.411094772 +0000 UTC m=+16.002358739,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 11 11:59:08 crc kubenswrapper[4816]: >
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.890409 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d74c202\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d74c202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,LastTimestamp:2026-03-11 11:58:49.411145233 +0000 UTC m=+16.002409200,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.898832 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d734768\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 11 11:59:08 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d734768 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 11 11:59:08 crc kubenswrapper[4816]: body:
Mar 11 11:59:08 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,LastTimestamp:2026-03-11 11:58:59.411002151 +0000 UTC m=+26.002266168,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 11 11:59:08 crc kubenswrapper[4816]: >
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.905131 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d74c202\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d74c202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,LastTimestamp:2026-03-11 11:58:59.411101904 +0000 UTC m=+26.002365901,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.908867 4816 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc79945c23947 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:59.414702407
+0000 UTC m=+26.005966414,LastTimestamp:2026-03-11 11:58:59.414702407 +0000 UTC m=+26.005966414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.910735 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc793a41afbb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793a41afbb9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.227773881 +0000 UTC m=+1.819037848,LastTimestamp:2026-03-11 11:58:59.547968573 +0000 UTC m=+26.139232580,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.915099 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc793b692fe14\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793b692fe14 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.537628692 +0000 UTC m=+2.128892659,LastTimestamp:2026-03-11 11:58:59.809172545 +0000 UTC m=+26.400436512,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:08 crc kubenswrapper[4816]: E0311 11:59:08.921945 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc793b7587e19\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc793b7587e19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:35.550572057 +0000 UTC m=+2.141836014,LastTimestamp:2026-03-11 11:58:59.823575016 +0000 UTC m=+26.414838983,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.060789 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.130404 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.132018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.132086 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.132108 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.132967 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.410761 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 11:59:09 crc kubenswrapper[4816]: I0311 11:59:09.410879 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 11:59:09 crc kubenswrapper[4816]: E0311 11:59:09.419831 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d734768\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 11:59:09 crc kubenswrapper[4816]: &Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d734768 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 11:59:09 crc kubenswrapper[4816]: body: Mar 11 11:59:09 crc kubenswrapper[4816]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.411087208 +0000 UTC m=+6.002351215,LastTimestamp:2026-03-11 11:59:09.410843548 +0000 UTC m=+36.002107545,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 11:59:09 crc kubenswrapper[4816]: > Mar 11 11:59:09 crc kubenswrapper[4816]: E0311 11:59:09.425690 4816 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bc7949d74c202\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bc7949d74c202 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 11:58:39.41118413 +0000 UTC m=+6.002448147,LastTimestamp:2026-03-11 11:59:09.41091313 +0000 UTC m=+36.002177127,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.056708 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.310505 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.311418 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.313108 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" exitCode=255 Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.313143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773"} Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.313175 4816 scope.go:117] "RemoveContainer" containerID="a2d9a3cb72a07d0729cf0547bd3c77f1b4b47da54ab64802497189af73f6f7c0" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.313337 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.314232 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.314275 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.314285 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:10 crc kubenswrapper[4816]: I0311 11:59:10.314807 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:10 crc kubenswrapper[4816]: E0311 11:59:10.314988 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:11 crc kubenswrapper[4816]: I0311 11:59:11.061237 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:11 crc kubenswrapper[4816]: I0311 
11:59:11.319774 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 11:59:12 crc kubenswrapper[4816]: I0311 11:59:12.059271 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.061049 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.211235 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.225020 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.587480 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.587671 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.588692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.588746 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.588759 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 11 11:59:13 crc kubenswrapper[4816]: I0311 11:59:13.589204 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:13 crc kubenswrapper[4816]: E0311 11:59:13.589407 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:14 crc kubenswrapper[4816]: I0311 11:59:14.056310 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:14 crc kubenswrapper[4816]: E0311 11:59:14.220519 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.058725 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:15 crc kubenswrapper[4816]: E0311 11:59:15.692859 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.701986 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:15 crc 
kubenswrapper[4816]: I0311 11:59:15.703738 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.703809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.703835 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:15 crc kubenswrapper[4816]: I0311 11:59:15.703882 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:15 crc kubenswrapper[4816]: E0311 11:59:15.710698 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.055984 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.417717 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.417958 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.419650 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.419698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.419713 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:16 crc kubenswrapper[4816]: I0311 11:59:16.424384 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.056340 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.337738 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.339081 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.339270 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:17 crc kubenswrapper[4816]: I0311 11:59:17.339389 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.059562 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.370352 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.370558 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 
11:59:18.371605 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.371660 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.371675 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:18 crc kubenswrapper[4816]: I0311 11:59:18.372176 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773" Mar 11 11:59:18 crc kubenswrapper[4816]: E0311 11:59:18.372358 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 11:59:18 crc kubenswrapper[4816]: W0311 11:59:18.789777 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 11 11:59:18 crc kubenswrapper[4816]: E0311 11:59:18.789843 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 11 11:59:19 crc kubenswrapper[4816]: I0311 11:59:19.057278 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 11 11:59:20 crc kubenswrapper[4816]: I0311 11:59:20.056737 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:20 crc kubenswrapper[4816]: W0311 11:59:20.727312 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:20 crc kubenswrapper[4816]: E0311 11:59:20.727367 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 11 11:59:21 crc kubenswrapper[4816]: I0311 11:59:21.055695 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.060316 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:22 crc kubenswrapper[4816]: E0311 11:59:22.696993 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 
11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.711072 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.712375 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.712440 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.712460 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:22 crc kubenswrapper[4816]: I0311 11:59:22.712501 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 11:59:22 crc kubenswrapper[4816]: E0311 11:59:22.716671 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.056644 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.576349 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.577194 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.578573 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.578608 4816 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:23 crc kubenswrapper[4816]: I0311 11:59:23.578619 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:24 crc kubenswrapper[4816]: I0311 11:59:24.058825 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:24 crc kubenswrapper[4816]: E0311 11:59:24.221353 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:25 crc kubenswrapper[4816]: I0311 11:59:25.057153 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:25 crc kubenswrapper[4816]: W0311 11:59:25.213735 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 11 11:59:25 crc kubenswrapper[4816]: E0311 11:59:25.213792 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 11 11:59:25 crc kubenswrapper[4816]: W0311 11:59:25.878611 4816 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the 
cluster scope Mar 11 11:59:25 crc kubenswrapper[4816]: E0311 11:59:25.878669 4816 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 11 11:59:26 crc kubenswrapper[4816]: I0311 11:59:26.057644 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:27 crc kubenswrapper[4816]: I0311 11:59:27.057707 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:28 crc kubenswrapper[4816]: I0311 11:59:28.057690 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.058189 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.129767 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.131452 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:29 crc 
kubenswrapper[4816]: I0311 11:59:29.131511 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.131532 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.132377 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773"
Mar 11 11:59:29 crc kubenswrapper[4816]: E0311 11:59:29.132663 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 11:59:29 crc kubenswrapper[4816]: E0311 11:59:29.706181 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.717292 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.719435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.719495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.719522 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:29 crc kubenswrapper[4816]: I0311 11:59:29.719595 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 11:59:29 crc kubenswrapper[4816]: E0311 11:59:29.728339 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 11 11:59:30 crc kubenswrapper[4816]: I0311 11:59:30.058217 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:31 crc kubenswrapper[4816]: I0311 11:59:31.059482 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:32 crc kubenswrapper[4816]: I0311 11:59:32.060144 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:33 crc kubenswrapper[4816]: I0311 11:59:33.059166 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:34 crc kubenswrapper[4816]: I0311 11:59:34.059119 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:34 crc kubenswrapper[4816]: E0311 11:59:34.221812 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 11 11:59:35 crc kubenswrapper[4816]: I0311 11:59:35.059360 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.058620 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:36 crc kubenswrapper[4816]: E0311 11:59:36.714383 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.730547 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.732709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.732755 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.732768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:36 crc kubenswrapper[4816]: I0311 11:59:36.732800 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 11:59:36 crc kubenswrapper[4816]: E0311 11:59:36.739051 4816 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 11 11:59:37 crc kubenswrapper[4816]: I0311 11:59:37.058659 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.059482 4816 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.567749 4816 csr.go:261] certificate signing request csr-j57dc is approved, waiting to be issued
Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.578002 4816 csr.go:257] certificate signing request csr-j57dc is issued
Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.660848 4816 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 11 11:59:38 crc kubenswrapper[4816]: I0311 11:59:38.915607 4816 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 11 11:59:39 crc kubenswrapper[4816]: I0311 11:59:39.579732 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 00:43:05.175896057 +0000 UTC
Mar 11 11:59:39 crc kubenswrapper[4816]: I0311 11:59:39.579811 4816 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6372h43m25.596089728s for next certificate rotation
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.130092 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.131912 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.131982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.132007 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.133046 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.402708 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.406586 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"}
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.406941 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.408599 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.408630 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:40 crc kubenswrapper[4816]: I0311 11:59:40.408641 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.410848 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.411383 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.412889 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" exitCode=255
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.412923 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"}
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.412956 4816 scope.go:117] "RemoveContainer" containerID="10676b0f39f00de057afcab21eeab6daa9b8e1f4d47611dc59ede1bad64c2773"
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.413085 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.414036 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.414055 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.414072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:41 crc kubenswrapper[4816]: I0311 11:59:41.414523 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"
Mar 11 11:59:41 crc kubenswrapper[4816]: E0311 11:59:41.414705 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 11:59:42 crc kubenswrapper[4816]: I0311 11:59:42.417019 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.588334 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.588690 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.590267 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.590306 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.590315 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.590933 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"
Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.591098 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.739826 4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.741435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.741487 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.741504 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.741633 4816 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.751705 4816 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.751952 4816 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.751976 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755928 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755959 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311
11:59:43.755969 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.755995 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:43Z","lastTransitionTime":"2026-03-11T11:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.768983 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775728 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775762 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.775775 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:43Z","lastTransitionTime":"2026-03-11T11:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.787099 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796606 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796716 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.796725 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:43Z","lastTransitionTime":"2026-03-11T11:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.807809 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816824 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816886 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816909 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:43 crc kubenswrapper[4816]: I0311 11:59:43.816953 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:43Z","lastTransitionTime":"2026-03-11T11:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.834794 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.835023    4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.835077    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:43 crc kubenswrapper[4816]: E0311 11:59:43.936238    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.036736    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.136984    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.222829    4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.238116    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.338554    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.439162    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.540411    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.641290    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.742448    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.842642    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:44 crc kubenswrapper[4816]: E0311 11:59:44.942911    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.044001    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.145101    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.246171    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.346887    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.447504    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.547646    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.648585    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.749000    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.849418    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:45 crc kubenswrapper[4816]: E0311 11:59:45.950444    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.051380    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.151834    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.251919    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.352838    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.453935    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.554369    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.654739    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.755677    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.855833    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:46 crc kubenswrapper[4816]: E0311 11:59:46.956156    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.056338    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.157041    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.257977    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.358091    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.459209    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.559648    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.660661    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.761769    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.862919    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:47 crc kubenswrapper[4816]: E0311 11:59:47.963316    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.063772    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.164745    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.265230    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.366047    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.370338    4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.370522    4816 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.371969    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.371997    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.372005    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:48 crc kubenswrapper[4816]: I0311 11:59:48.372499    4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.372724    4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.466602    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.566989    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.667691    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.768696    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.869949    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:48 crc kubenswrapper[4816]: E0311 11:59:48.970400    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.070917    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.171502    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.271938    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.372061    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.473144    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.573541    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.674595    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.775820    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.875934    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:49 crc kubenswrapper[4816]: E0311 11:59:49.976808    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.077846    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.178450    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.278653    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.379760    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.480625    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.581533    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.682508    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.783414    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.884323    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:50 crc kubenswrapper[4816]: E0311 11:59:50.984726    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.085609    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.185772    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.286420    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.386730    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.487585    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.587781    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.688968    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.789560    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: I0311 11:59:51.857895    4816 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.890188    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:51 crc kubenswrapper[4816]: E0311 11:59:51.990841    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.091711    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.192206    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.292931    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.393449    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.493880    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.594820    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.695877    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.796968    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.898077    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:52 crc kubenswrapper[4816]: E0311 11:59:52.999291    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.099520    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.199721    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.299871    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.401127    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.501394    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.602273    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.703377    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.803986    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:53 crc kubenswrapper[4816]: E0311 11:59:53.905219    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.006438    4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.034997    4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041123    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041178    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041204    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041238    4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.041307    4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:54Z","lastTransitionTime":"2026-03-11T11:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.059330 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065228 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065284 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065307 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065331 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.065352 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:54Z","lastTransitionTime":"2026-03-11T11:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.082306 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087185 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087230 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087249 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.087311 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:54Z","lastTransitionTime":"2026-03-11T11:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.103474 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107697 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107732 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:54 crc kubenswrapper[4816]: I0311 11:59:54.107773 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:54Z","lastTransitionTime":"2026-03-11T11:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.124296 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.124558 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.124610 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.223486 4816 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.225677 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.326450 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.426550 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.526717 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.626939 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.728122 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: E0311 11:59:54.828914 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:54 crc kubenswrapper[4816]: 
E0311 11:59:54.929928 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.030216 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.130914 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.232156 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.332378 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.432564 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.533597 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.633754 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.734325 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.835456 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:55 crc kubenswrapper[4816]: E0311 11:59:55.936016 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:56 crc kubenswrapper[4816]: E0311 11:59:56.036322 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 11 11:59:56 crc kubenswrapper[4816]: E0311 11:59:56.137194 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:56 crc kubenswrapper[4816]: E0311 11:59:56.238118 4816 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.273780 4816 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.340943 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.340977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.340987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.341003 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.341011 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.443914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.443967 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.443978 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.443995 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.444007 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545703 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545739 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545749 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545762 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.545771 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648286 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648324 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648332 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648345 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.648355 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751948 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751957 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751973 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.751983 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855614 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855735 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855759 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855783 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.855803 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959054 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959118 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959160 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:56 crc kubenswrapper[4816]: I0311 11:59:56.959176 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:56Z","lastTransitionTime":"2026-03-11T11:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062273 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062356 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062387 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062417 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.062441 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.079332 4816 apiserver.go:52] "Watching apiserver" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.087172 4816 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.087699 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.088433 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.088585 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.088800 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.088892 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.088963 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.089768 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.089789 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.090040 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.090235 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093212 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093397 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093520 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.093540 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.094241 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.094306 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.094696 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.096584 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.135343 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.151603 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.164718 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165511 4816 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165523 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165656 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165682 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.165699 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.175980 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176037 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176140 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176201 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176290 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176341 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176388 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176432 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176483 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.176530 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176554 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176573 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176617 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176631 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176663 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176715 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176759 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176803 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176856 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176903 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176950 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.176998 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177041 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177087 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177134 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177161 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177179 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177224 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177307 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177327 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177353 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177359 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177490 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177511 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177531 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177442 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177547 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177637 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177664 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177719 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.177693 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178075 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179147 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179203 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179413 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179494 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179757 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179902 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178395 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178747 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178882 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178926 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.178946 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179138 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179633 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179784 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179910 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.179937 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180208 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180289 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180309 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180325 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180498 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180560 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180611 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180665 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180715 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180769 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180818 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180873 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180921 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180923 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180971 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181027 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180392 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") 
pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181079 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181135 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181187 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181324 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc 
kubenswrapper[4816]: I0311 11:59:57.181378 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181436 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181485 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181536 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181638 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181687 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181782 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181874 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.181925 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181971 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182021 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182443 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182498 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182551 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182610 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182659 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182706 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182872 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182945 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183001 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183057 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183106 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183154 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183203 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183341 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183390 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183439 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.183580 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183638 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183690 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183742 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183792 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183843 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183891 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183939 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183991 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184043 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184092 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184139 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184187 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184246 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184333 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184381 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184437 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184486 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184533 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184637 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184689 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 11:59:57 
crc kubenswrapper[4816]: I0311 11:59:57.184740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184790 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184844 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184892 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184948 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.184998 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185050 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185155 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185207 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185300 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185416 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185464 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185515 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185563 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185613 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185666 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185718 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185768 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185821 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185873 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185930 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.185982 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186083 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186336 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186396 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186446 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186553 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186604 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188165 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188226 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188342 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188396 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188458 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188520 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188575 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188625 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188695 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188746 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188800 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188857 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188914 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180577 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180573 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.180846 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181069 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181413 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181449 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181708 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.181722 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.189093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182111 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182230 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182298 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.182451 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183018 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.183718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.186949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.187151 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.187752 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.187828 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188025 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188382 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188836 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188936 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.189525 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.189751 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.189646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.188968 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.191482 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192129 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192160 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192270 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192298 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" 
(UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192319 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192339 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192361 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192394 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192416 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192435 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192456 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192497 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192516 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192536 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192553 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192572 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192592 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192614 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192638 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192684 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192709 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192756 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192785 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192822 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192867 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192892 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192912 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192933 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192953 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192972 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192994 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193018 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193036 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193056 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193082 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193146 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193166 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193233 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193249 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193290 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193302 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" 
DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193313 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193324 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193336 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193345 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193355 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193365 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193374 4816 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193384 4816 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193393 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193403 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193413 4816 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193423 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193433 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193443 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193452 4816 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193462 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193471 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193484 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193494 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193505 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193515 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193524 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc 
kubenswrapper[4816]: I0311 11:59:57.193534 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193543 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193553 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193563 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193572 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193581 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193593 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.193603 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193613 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193622 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193632 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193641 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193651 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193661 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193670 4816 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193681 4816 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193692 4816 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193702 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193711 4816 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193721 4816 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193730 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.191398 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" 
(OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193764 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192022 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192052 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.192727 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193823 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.193126 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194407 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194583 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194724 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.194806 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 11:59:57.694769863 +0000 UTC m=+84.286033930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.195115 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.195245 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.196613 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194817 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194833 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.194867 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.200511 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.200735 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.200829 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:57.700802036 +0000 UTC m=+84.292066123 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.200889 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.201819 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.201985 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.202057 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:57.702034541 +0000 UTC m=+84.293298518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.202235 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.202425 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.202570 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.202949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.211568 4816 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.213948 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.214487 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.214543 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.216622 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.216757 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.217036 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.217851 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.219229 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.219313 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.219341 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.219598 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:57.719561663 +0000 UTC m=+84.310825690 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.222033 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.222239 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.222302 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.222322 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.222393 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-11 11:59:57.722372373 +0000 UTC m=+84.313636380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.223392 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.224146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.224644 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.224687 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.224733 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.225305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.225560 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.225706 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.226104 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.226613 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.228517 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.229869 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.230707 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.230978 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.230985 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.231472 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.231823 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.232065 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.232484 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.232660 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.232794 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.233213 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.233600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.234665 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.236177 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.236586 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.237583 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.237753 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.238386 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.239320 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.239347 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.239563 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.239824 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.240105 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.240849 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.240908 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.240859 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.242630 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.242855 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.243180 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.243198 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.243711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.244926 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.245813 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.246504 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.246555 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.247402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.247671 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.247703 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248005 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248019 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248413 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248444 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248772 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.249187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.249193 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.248931 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.249582 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.249682 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.250772 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.250818 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.251604 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252118 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252462 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252688 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252718 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252912 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.252921 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.253587 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.253723 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.253958 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254575 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254903 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254899 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.255589 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.254389 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.256500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.256773 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.256881 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257072 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257145 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257290 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257319 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257362 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257435 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257493 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257579 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.257602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.258003 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.258223 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259150 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259453 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259779 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259785 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.259923 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260250 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260308 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260401 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260508 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260965 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260512 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260832 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260865 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.260908 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.261313 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.261563 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.261633 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.261790 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262151 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262166 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262575 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262617 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262730 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262860 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.262962 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270512 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270558 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270568 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.270603 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.275911 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.283926 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.284238 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295022 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295164 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295177 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295189 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295198 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: 
I0311 11:59:57.295208 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295217 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295226 4816 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295235 4816 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295261 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295271 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295314 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc 
kubenswrapper[4816]: I0311 11:59:57.295325 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295335 4816 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295347 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295358 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295370 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295378 4816 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295387 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295396 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295405 4816 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295414 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295424 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295434 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295442 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295452 4816 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295462 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295472 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295482 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295491 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295501 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295511 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295520 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295529 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295538 4816 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295547 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295555 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295565 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295574 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295584 4816 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295592 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295601 4816 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295611 4816 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295619 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295629 4816 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295638 4816 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295665 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295676 4816 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295687 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" 
Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295696 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295706 4816 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295715 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295724 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295734 4816 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295742 4816 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295750 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 
11:59:57.295759 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295767 4816 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295777 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295786 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295796 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295805 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295816 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295827 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") 
on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295839 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295851 4816 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295859 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295868 4816 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295876 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295887 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295926 4816 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: 
I0311 11:59:57.295935 4816 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295944 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295952 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295961 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.295973 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296002 4816 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296016 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296029 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296042 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296055 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296206 4816 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296226 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296237 4816 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296276 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296289 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" 
DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296301 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296312 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296324 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296355 4816 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296368 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296382 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296376 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296394 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296460 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296515 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296541 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296554 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296567 4816 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296613 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296631 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296650 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296709 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296726 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296773 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296797 4816 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296813 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" 
DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296861 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296879 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296895 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296942 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296960 4816 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296649 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296976 4816 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297086 4816 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297112 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297132 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297154 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.296459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297173 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath 
\"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297191 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297209 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297225 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297242 4816 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297313 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297331 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297351 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297370 4816 
reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297389 4816 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297407 4816 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297425 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297443 4816 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297462 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297479 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297496 4816 reconciler_common.go:293] "Volume detached for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297513 4816 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297530 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297548 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297565 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297582 4816 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297600 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297617 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297634 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297652 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297671 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297691 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297707 4816 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297728 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297750 4816 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node 
\"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.297773 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373646 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373686 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373698 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373714 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.373725 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.398819 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.412082 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.425421 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.431194 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: source /etc/kubernetes/apiserver-url.env Mar 11 11:59:57 crc kubenswrapper[4816]: else Mar 11 11:59:57 crc kubenswrapper[4816]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 11 11:59:57 crc kubenswrapper[4816]: exit 1 Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 11 11:59:57 crc kubenswrapper[4816]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.432396 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.438124 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 11:59:57 crc kubenswrapper[4816]: W0311 11:59:57.439738 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-0ae7d384c6c6aee635dc8d8b14f1b73fb77069547d903b12453a02c7ef21aa57 WatchSource:0}: Error finding container 0ae7d384c6c6aee635dc8d8b14f1b73fb77069547d903b12453a02c7ef21aa57: Status 404 returned error can't find the container with id 0ae7d384c6c6aee635dc8d8b14f1b73fb77069547d903b12453a02c7ef21aa57 Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.443582 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: source "/env/_master" Mar 11 11:59:57 crc kubenswrapper[4816]: set +o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 11 11:59:57 crc kubenswrapper[4816]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 11 11:59:57 crc kubenswrapper[4816]: ho_enable="--enable-hybrid-overlay" Mar 11 11:59:57 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 11 11:59:57 crc kubenswrapper[4816]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 11 11:59:57 crc kubenswrapper[4816]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-host=127.0.0.1 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-port=9743 \ Mar 11 11:59:57 crc kubenswrapper[4816]: ${ho_enable} \ Mar 11 11:59:57 crc kubenswrapper[4816]: --enable-interconnect \ Mar 11 11:59:57 crc kubenswrapper[4816]: --disable-approver \ Mar 11 11:59:57 crc kubenswrapper[4816]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --wait-for-kubernetes-api=200s \ Mar 11 11:59:57 crc kubenswrapper[4816]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 11 11:59:57 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.446861 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 11:59:57 crc 
kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: source "/env/_master" Mar 11 11:59:57 crc kubenswrapper[4816]: set +o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: Mar 11 11:59:57 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --disable-webhook \ Mar 11 11:59:57 crc kubenswrapper[4816]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 11 11:59:57 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.448077 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 11 11:59:57 crc kubenswrapper[4816]: W0311 11:59:57.448651 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-92a1a9e28f924bb0f7181cc7ecce3df3c84d39c2f581ac287f4f874561bc443c WatchSource:0}: Error finding container 92a1a9e28f924bb0f7181cc7ecce3df3c84d39c2f581ac287f4f874561bc443c: Status 404 returned error can't find the container with id 92a1a9e28f924bb0f7181cc7ecce3df3c84d39c2f581ac287f4f874561bc443c Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.450584 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.452272 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.459919 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6ff8b55d8351f8419526fa7b016e8b28378459404a831b8bc005d60f9b1785d3"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.460897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"92a1a9e28f924bb0f7181cc7ecce3df3c84d39c2f581ac287f4f874561bc443c"} Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.461103 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: source /etc/kubernetes/apiserver-url.env Mar 11 11:59:57 crc kubenswrapper[4816]: else Mar 11 11:59:57 crc kubenswrapper[4816]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 11 11:59:57 crc kubenswrapper[4816]: exit 1 Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 11 11:59:57 crc kubenswrapper[4816]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.461800 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.462213 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.463295 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.464132 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0ae7d384c6c6aee635dc8d8b14f1b73fb77069547d903b12453a02c7ef21aa57"} Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.466327 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: source "/env/_master" Mar 11 11:59:57 crc kubenswrapper[4816]: set +o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not 
enabled. Mar 11 11:59:57 crc kubenswrapper[4816]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 11 11:59:57 crc kubenswrapper[4816]: ho_enable="--enable-hybrid-overlay" Mar 11 11:59:57 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 11 11:59:57 crc kubenswrapper[4816]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 11 11:59:57 crc kubenswrapper[4816]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-host=127.0.0.1 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --webhook-port=9743 \ Mar 11 11:59:57 crc kubenswrapper[4816]: ${ho_enable} \ Mar 11 11:59:57 crc kubenswrapper[4816]: --enable-interconnect \ Mar 11 11:59:57 crc kubenswrapper[4816]: --disable-approver \ Mar 11 11:59:57 crc kubenswrapper[4816]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --wait-for-kubernetes-api=200s \ Mar 11 11:59:57 crc kubenswrapper[4816]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 11 11:59:57 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.468116 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 11:59:57 crc kubenswrapper[4816]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe 
Mar 11 11:59:57 crc kubenswrapper[4816]: if [[ -f "/env/_master" ]]; then Mar 11 11:59:57 crc kubenswrapper[4816]: set -o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: source "/env/_master" Mar 11 11:59:57 crc kubenswrapper[4816]: set +o allexport Mar 11 11:59:57 crc kubenswrapper[4816]: fi Mar 11 11:59:57 crc kubenswrapper[4816]: Mar 11 11:59:57 crc kubenswrapper[4816]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 11 11:59:57 crc kubenswrapper[4816]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 11:59:57 crc kubenswrapper[4816]: --disable-webhook \ Mar 11 11:59:57 crc kubenswrapper[4816]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 11 11:59:57 crc kubenswrapper[4816]: --loglevel="${LOGLEVEL}" Mar 11 11:59:57 crc kubenswrapper[4816]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 11:59:57 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.469239 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.473760 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475603 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475658 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475677 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475704 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.475724 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.484461 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.494523 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.506027 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.520192 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.528570 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.537506 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.544777 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.552918 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.561567 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.568809 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.576293 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577497 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577519 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577527 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 
11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577540 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.577566 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679550 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679558 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679571 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.679582 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.701343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.701427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.701521 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.701604 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.701572465 +0000 UTC m=+85.292836442 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.701674 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.701659957 +0000 UTC m=+85.292923924 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782220 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782280 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782289 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782301 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.782310 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.802768 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.802828 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.802849 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.802926 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.802996 4816 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.802983127 +0000 UTC m=+85.394247084 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.802946 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803053 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803063 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803115 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803179 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 
11:59:57.803205 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803112 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.8030947 +0000 UTC m=+85.394358667 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: E0311 11:59:57.803349 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 11:59:58.803315756 +0000 UTC m=+85.394579763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884599 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884643 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884664 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.884672 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987379 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987452 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:57 crc kubenswrapper[4816]: I0311 11:59:57.987466 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:57Z","lastTransitionTime":"2026-03-11T11:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090412 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090470 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090491 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090559 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.090577 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.134798 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.135396 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.136757 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.137359 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.138264 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.138721 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.139329 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.140246 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.141023 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.141990 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.142467 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.143643 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.144181 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.144757 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.145625 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.146110 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.147043 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.147447 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.147980 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.149068 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.149506 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.150411 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.150868 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.151870 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.152359 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.152917 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.153927 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.154386 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.155316 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.155726 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.156529 4816 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.156624 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.158213 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.159053 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.159451 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.160847 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.161592 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.162394 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.162973 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.163944 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.164384 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.165285 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.165839 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.166811 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.167270 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.168091 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.168591 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.169611 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.170071 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.170952 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.171415 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.172218 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.172751 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.173176 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.192537 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.192677 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc 
kubenswrapper[4816]: I0311 11:59:58.192758 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.192860 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.192946 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296395 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296510 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296604 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.296680 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399473 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399523 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399555 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.399569 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502549 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502597 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502613 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.502624 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606453 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606522 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606551 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606581 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.606602 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710104 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710165 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710186 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710213 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.710234 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.713375 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.713521 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:00.71349412 +0000 UTC m=+87.304758127 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.713577 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.713742 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.713809 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:00.713795079 +0000 UTC m=+87.305059076 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813041 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813116 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813153 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813185 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.813212 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.814039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.814131 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.814187 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814330 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814352 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814361 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814394 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814417 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814455 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:00.814427728 +0000 UTC m=+87.405691735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814373 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814493 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:00.81446882 +0000 UTC m=+87.405732817 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814500 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:58 crc kubenswrapper[4816]: E0311 11:59:58.814582 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:00.814562312 +0000 UTC m=+87.405826319 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915643 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915693 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915731 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:58 crc kubenswrapper[4816]: I0311 11:59:58.915749 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:58Z","lastTransitionTime":"2026-03-11T11:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018218 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018330 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018354 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018386 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.018409 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120703 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120752 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120769 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.120779 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.129929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.130041 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.129994 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.130598 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.130868 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.130971 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.153162 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.154582 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.158799 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"
Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.160175 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223058 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223141 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223166 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223196 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.223219 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326479 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326518 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326547 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.326558 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429647 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429715 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429731 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.429772 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.469865 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd"
Mar 11 11:59:59 crc kubenswrapper[4816]: E0311 11:59:59.470067 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531577 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531658 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531680 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531711 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.531735 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634164 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634559 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634632 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.634695 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738433 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738481 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.738531 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841508 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841543 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.841568 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944532 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944544 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944562 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 11:59:59 crc kubenswrapper[4816]: I0311 11:59:59.944573 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T11:59:59Z","lastTransitionTime":"2026-03-11T11:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047930 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.047948 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151561 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151619 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151636 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.151677 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.258799 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.258874 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.258888 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.258905 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.259442 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362471 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362528 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.362542 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464796 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464844 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464861 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464879 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.464889 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568226 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568313 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568330 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568353 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.568375 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671178 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671279 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671299 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.671354 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.735552 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.735673 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.735850 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.735969 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.735924319 +0000 UTC m=+91.327188326 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.736061 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.736036852 +0000 UTC m=+91.327300859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773754 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773764 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773780 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.773794 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.838653 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.838784 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.838853 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.838988 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839028 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839033 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839059 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839087 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839087 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839107 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.839080171 +0000 UTC m=+91.430344248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839122 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839158 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.839135462 +0000 UTC m=+91.430399459 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 12:00:00 crc kubenswrapper[4816]: E0311 12:00:00.839195 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:04.839178513 +0000 UTC m=+91.430442620 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877852 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877909 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877918 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877934 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.877943 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.979919 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.979993 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.980016 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.980063 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:00 crc kubenswrapper[4816]: I0311 12:00:00.980091 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:00Z","lastTransitionTime":"2026-03-11T12:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082898 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082917 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082944 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.082962 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.129966 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.130046 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:01 crc kubenswrapper[4816]: E0311 12:00:01.130113 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 12:00:01 crc kubenswrapper[4816]: E0311 12:00:01.130237 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.130343 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:01 crc kubenswrapper[4816]: E0311 12:00:01.130542 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.185907 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.185965 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.185990 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.186018 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.186037 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.288906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.288968 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.288987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.289014 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.289033 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392394 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392463 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392481 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392505 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.392523 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494361 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494408 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494424 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.494435 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596826 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596857 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596884 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.596896 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700368 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700438 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700466 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700493 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.700513 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803734 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803778 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803787 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.803811 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906610 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906688 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906712 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:01 crc kubenswrapper[4816]: I0311 12:00:01.906763 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:01Z","lastTransitionTime":"2026-03-11T12:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009321 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.009460 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111662 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.111703 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214517 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214555 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214564 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214581 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.214590 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317355 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.317426 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419108 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419166 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419183 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419206 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.419226 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521153 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521220 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521240 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521294 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.521314 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624410 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624438 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.624456 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727490 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727548 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727556 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727569 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.727636 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831015 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831095 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831120 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.831173 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.860191 4816 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933797 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933859 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933876 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933899 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:02 crc kubenswrapper[4816]: I0311 12:00:02.933916 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:02Z","lastTransitionTime":"2026-03-11T12:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036686 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.036747 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.130333 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.130410 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.130411 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:03 crc kubenswrapper[4816]: E0311 12:00:03.130660 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:03 crc kubenswrapper[4816]: E0311 12:00:03.130772 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:03 crc kubenswrapper[4816]: E0311 12:00:03.130834 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139478 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139536 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139556 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139580 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.139597 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.142053 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241513 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241559 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241576 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241598 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.241614 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344331 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344389 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344412 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344441 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.344464 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.447216 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.447333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.447977 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.448023 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.448040 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549848 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549891 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549901 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549916 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.549929 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652661 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652705 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652722 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.652763 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755434 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755509 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755532 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755560 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.755581 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858691 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858796 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.858817 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962852 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:03 crc kubenswrapper[4816]: I0311 12:00:03.962914 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:03Z","lastTransitionTime":"2026-03-11T12:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066020 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066080 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066106 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066137 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.066159 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144588 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144608 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144637 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.144656 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.150861 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.160673 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.166741 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.166814 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.166830 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.166855 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.167060 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.168343 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.180984 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.184102 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185414 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185480 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185503 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 
12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185531 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.185550 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.202297 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.202466 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206835 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206877 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206888 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206906 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.206920 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.220417 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.221642 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226583 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226633 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.226650 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.234423 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.249764 4816 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"91fc6571-6a6d-490b-83e1-c64cf62c773c\\\",\\\"systemUUID\\\":\\\"bbfa0147-7ad8-4a96-81ed-304e5bc4397b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.250216 4816 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.251945 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.251982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.251991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.252006 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.252015 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.262448 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.274984 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.285243 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.355136 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc 
kubenswrapper[4816]: I0311 12:00:04.355177 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.355187 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.355204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.355218 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457566 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457598 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457607 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457622 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.457631 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.559974 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.560005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.560015 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.560029 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.560038 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663498 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663561 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663580 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663609 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.663632 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766453 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766491 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766499 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766514 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.766523 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.778143 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.778305 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.778477 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.77842292 +0000 UTC m=+99.369686927 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.778535 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.778682 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.778647766 +0000 UTC m=+99.369911773 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.870845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.870925 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.870949 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.870982 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.871002 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.879397 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.879483 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.879522 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879650 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879728 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.879702508 +0000 UTC m=+99.470966515 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879876 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879924 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879943 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.879990 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.879975575 +0000 UTC m=+99.471239572 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.880485 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.880534 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.880556 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:04 crc kubenswrapper[4816]: E0311 12:00:04.880625 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:12.880605573 +0000 UTC m=+99.471869570 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.975145 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.976146 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.976421 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.977063 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:04 crc kubenswrapper[4816]: I0311 12:00:04.977118 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:04Z","lastTransitionTime":"2026-03-11T12:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080104 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080158 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080168 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080183 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.080192 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.129610 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:05 crc kubenswrapper[4816]: E0311 12:00:05.129760 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.129820 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.129991 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:05 crc kubenswrapper[4816]: E0311 12:00:05.130005 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:05 crc kubenswrapper[4816]: E0311 12:00:05.130200 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182379 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182457 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.182471 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285302 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285320 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285347 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.285364 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387711 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387766 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387774 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387786 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.387796 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489237 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489292 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489303 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489318 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.489330 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591673 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591726 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591738 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591756 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.591769 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693583 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693630 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693643 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.693656 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795613 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795663 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795676 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.795703 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897627 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897671 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897681 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897702 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.897713 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999903 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999944 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999955 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999969 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:05 crc kubenswrapper[4816]: I0311 12:00:05.999979 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:05Z","lastTransitionTime":"2026-03-11T12:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.102317 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.102791 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.102934 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.103057 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.103174 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.205968 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.206019 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.206030 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.206047 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.206061 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309351 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309398 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309408 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309424 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.309438 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412545 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412585 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412595 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412609 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.412619 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515238 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515323 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515334 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515346 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.515355 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.525101 4816 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618004 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618041 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618053 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618069 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.618080 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721111 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721235 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.721306 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823366 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823406 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823415 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.823438 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.926932 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.926970 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.926980 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.927012 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:06 crc kubenswrapper[4816]: I0311 12:00:06.927023 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:06Z","lastTransitionTime":"2026-03-11T12:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029586 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029663 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029681 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029705 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.029726 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.129720 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.129767 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.129739 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:07 crc kubenswrapper[4816]: E0311 12:00:07.129852 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:07 crc kubenswrapper[4816]: E0311 12:00:07.130062 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:07 crc kubenswrapper[4816]: E0311 12:00:07.130110 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132673 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132726 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132743 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.132755 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235833 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235902 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235926 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235956 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.235976 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339214 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339308 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339328 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339839 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.339899 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442624 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442767 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442856 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.442934 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546370 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546483 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546549 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546574 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.546596 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.649935 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.650009 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.650034 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.650069 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.650095 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753276 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753355 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753378 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753409 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.753431 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856402 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856444 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856458 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.856468 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958536 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958569 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958579 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958591 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:07 crc kubenswrapper[4816]: I0311 12:00:07.958600 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:07Z","lastTransitionTime":"2026-03-11T12:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061078 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061108 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061117 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.061166 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163602 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163665 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163688 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163716 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.163737 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266914 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266962 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266972 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266987 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.266997 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370039 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370103 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370122 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370150 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.370175 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.472901 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.473048 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.473082 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.473118 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.473144 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576215 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576233 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576290 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.576314 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678697 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678760 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.678769 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782496 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782548 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782562 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782579 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.782590 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.811778 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-x2vtk"] Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.812438 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.814282 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.814283 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.815195 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.830446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.841890 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.851367 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.865860 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.885539 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.885742 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.885968 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc 
kubenswrapper[4816]: I0311 12:00:08.885983 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.886005 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.886017 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.899736 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.911963 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.924405 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.932747 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5szhb\" (UniqueName: \"kubernetes.io/projected/6497a90c-3b50-4dba-80d3-085c57f4f567-kube-api-access-5szhb\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.932857 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497a90c-3b50-4dba-80d3-085c57f4f567-hosts-file\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.933916 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.955490 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989198 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989349 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989371 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989405 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:08 crc kubenswrapper[4816]: I0311 12:00:08.989428 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:08Z","lastTransitionTime":"2026-03-11T12:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.033699 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497a90c-3b50-4dba-80d3-085c57f4f567-hosts-file\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.033780 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5szhb\" (UniqueName: \"kubernetes.io/projected/6497a90c-3b50-4dba-80d3-085c57f4f567-kube-api-access-5szhb\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.034098 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6497a90c-3b50-4dba-80d3-085c57f4f567-hosts-file\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.053874 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5szhb\" (UniqueName: \"kubernetes.io/projected/6497a90c-3b50-4dba-80d3-085c57f4f567-kube-api-access-5szhb\") pod \"node-resolver-x2vtk\" (UID: \"6497a90c-3b50-4dba-80d3-085c57f4f567\") " pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092737 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092819 4816 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092842 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.092860 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.129895 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:09 crc kubenswrapper[4816]: E0311 12:00:09.130135 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.130191 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.130390 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:09 crc kubenswrapper[4816]: E0311 12:00:09.131532 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:09 crc kubenswrapper[4816]: E0311 12:00:09.131713 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.137138 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x2vtk" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.173160 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mdbt5"] Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.173513 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zbg7x"] Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.173739 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.179760 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.180028 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.180207 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.181353 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.181680 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.182270 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-b4v82"] Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.182763 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.182842 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.186018 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.188762 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.188830 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.189489 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.189673 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.189839 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.191239 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.193700 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 
11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.195937 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.195983 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.195998 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.196020 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.196035 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.209029 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.223836 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.242486 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.252683 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 
12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.262653 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.275143 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.286595 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.296174 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310581 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310639 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 
12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310654 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.310663 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.318883 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.335958 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7fdff21c-644f-4443-a268-f98c91ea120a-proxy-tls\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336030 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-os-release\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336046 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-etc-kubernetes\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336062 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336224 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fdff21c-644f-4443-a268-f98c91ea120a-mcd-auth-proxy-config\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336314 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-system-cni-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336336 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd5cm\" (UniqueName: \"kubernetes.io/projected/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-kube-api-access-gd5cm\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336374 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-system-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336408 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-socket-dir-parent\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336434 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-netns\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 
12:00:09.336467 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7fdff21c-644f-4443-a268-f98c91ea120a-rootfs\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336502 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-kubelet\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336559 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-conf-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336638 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-hostroot\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336678 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336697 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-cni-binary-copy\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-multus-daemon-config\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336732 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-os-release\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336832 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336856 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-bin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 
12:00:09.336904 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5sxg\" (UniqueName: \"kubernetes.io/projected/a30d3e88-e081-4303-a202-1b7505629539-kube-api-access-q5sxg\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336936 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-cnibin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.336962 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-multus\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.337003 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-k8s-cni-cncf-io\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.337032 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-multus-certs\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.337107 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdgk\" (UniqueName: \"kubernetes.io/projected/7fdff21c-644f-4443-a268-f98c91ea120a-kube-api-access-jqdgk\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.337137 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cnibin\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.339256 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\
":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.347978 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.360685 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.371143 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.381751 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.391516 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.402043 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.412294 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.416303 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.416349 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.416360 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.416382 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.416403 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.425500 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438418 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cnibin\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438464 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438483 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7fdff21c-644f-4443-a268-f98c91ea120a-proxy-tls\") pod \"machine-config-daemon-b4v82\" (UID: 
\"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438503 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-os-release\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438520 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-etc-kubernetes\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438538 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438539 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cnibin\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438652 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fdff21c-644f-4443-a268-f98c91ea120a-mcd-auth-proxy-config\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438688 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-system-cni-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438709 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd5cm\" (UniqueName: \"kubernetes.io/projected/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-kube-api-access-gd5cm\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438741 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-system-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-socket-dir-parent\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438779 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-netns\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " 
pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438805 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7fdff21c-644f-4443-a268-f98c91ea120a-rootfs\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438824 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-kubelet\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438858 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-conf-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438879 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-hostroot\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438896 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438915 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-cni-binary-copy\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438937 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-multus-daemon-config\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438962 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-os-release\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.438988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-bin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q5sxg\" (UniqueName: \"kubernetes.io/projected/a30d3e88-e081-4303-a202-1b7505629539-kube-api-access-q5sxg\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439064 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-cnibin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439086 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-multus\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439105 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-k8s-cni-cncf-io\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439128 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-multus-certs\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439157 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdgk\" (UniqueName: \"kubernetes.io/projected/7fdff21c-644f-4443-a268-f98c91ea120a-kube-api-access-jqdgk\") 
pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439226 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-binary-copy\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.439855 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440047 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-bin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440699 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-cnibin\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " 
pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440735 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fdff21c-644f-4443-a268-f98c91ea120a-mcd-auth-proxy-config\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440750 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-cni-multus\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-k8s-cni-cncf-io\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440791 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-system-cni-dir\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440810 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-multus-certs\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.440863 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-os-release\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.440941 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-cni-binary-copy\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441084 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-system-cni-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441143 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-socket-dir-parent\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441180 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-run-netns\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441217 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/7fdff21c-644f-4443-a268-f98c91ea120a-rootfs\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441295 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-host-var-lib-kubelet\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441335 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-multus-conf-dir\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441367 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-hostroot\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441425 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-os-release\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a30d3e88-e081-4303-a202-1b7505629539-etc-kubernetes\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " 
pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.441601 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a30d3e88-e081-4303-a202-1b7505629539-multus-daemon-config\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.447613 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:3
6Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.451271 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7fdff21c-644f-4443-a268-f98c91ea120a-proxy-tls\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.458687 4816 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.460560 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5sxg\" (UniqueName: \"kubernetes.io/projected/a30d3e88-e081-4303-a202-1b7505629539-kube-api-access-q5sxg\") pod \"multus-mdbt5\" (UID: \"a30d3e88-e081-4303-a202-1b7505629539\") " pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.461566 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdgk\" (UniqueName: \"kubernetes.io/projected/7fdff21c-644f-4443-a268-f98c91ea120a-kube-api-access-jqdgk\") pod \"machine-config-daemon-b4v82\" (UID: \"7fdff21c-644f-4443-a268-f98c91ea120a\") " pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.465436 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd5cm\" (UniqueName: \"kubernetes.io/projected/020fe9c8-a66d-450b-b7b3-b83bcd2bf552-kube-api-access-gd5cm\") pod \"multus-additional-cni-plugins-zbg7x\" (UID: \"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\") " pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.469315 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.477645 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.486262 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.507571 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mdbt5" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.514138 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519152 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519308 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519413 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519516 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519600 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.519930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x2vtk" event={"ID":"6497a90c-3b50-4dba-80d3-085c57f4f567","Type":"ContainerStarted","Data":"9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.520027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x2vtk" event={"ID":"6497a90c-3b50-4dba-80d3-085c57f4f567","Type":"ContainerStarted","Data":"cad11bdd7e68667f7df7a431510b94ff411cd6b51b86a25038a8c406f07c96e3"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.524087 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.524578 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" Mar 11 12:00:09 crc kubenswrapper[4816]: W0311 12:00:09.527310 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda30d3e88_e081_4303_a202_1b7505629539.slice/crio-c5a149483c2fd203a95d61febf5181396381d452588438b940fdacc6f89c4e91 WatchSource:0}: Error finding container c5a149483c2fd203a95d61febf5181396381d452588438b940fdacc6f89c4e91: Status 404 returned error can't find the container with id c5a149483c2fd203a95d61febf5181396381d452588438b940fdacc6f89c4e91 Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.530365 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.536101 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkh2h"] Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.537949 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: W0311 12:00:09.538786 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fdff21c_644f_4443_a268_f98c91ea120a.slice/crio-2d58cf54804a0cc94bf9cfdd8ca1ca7961e514eacea1a5fd5d725dd7aeea2d32 WatchSource:0}: Error finding container 2d58cf54804a0cc94bf9cfdd8ca1ca7961e514eacea1a5fd5d725dd7aeea2d32: Status 404 returned error can't find the container with id 2d58cf54804a0cc94bf9cfdd8ca1ca7961e514eacea1a5fd5d725dd7aeea2d32 Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.540328 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.540481 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.540742 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.540779 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.542485 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.542658 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.544430 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.546764 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.559985 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.576861 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.591508 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.602229 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.614184 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.623072 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.623105 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 
12:00:09.623121 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.623144 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.623160 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.630602 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640080 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640134 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640165 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640198 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640294 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640323 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640349 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640378 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.640423 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640560 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640636 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640670 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" 
Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640704 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640740 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640774 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640825 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640888 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.640917 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.641886 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.652933 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.671695 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.685446 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.699793 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.712353 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.721105 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726520 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726546 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.726560 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.732570 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742175 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742230 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742292 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742310 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742333 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742354 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742382 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742425 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.742449 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742489 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742509 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742550 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742577 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742641 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742903 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742941 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742963 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.742987 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.743626 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.743959 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744001 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744456 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744503 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744531 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744554 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") pod \"ovnkube-node-dkh2h\" (UID: 
\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744577 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.744612 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.745344 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.745383 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.745415 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc 
kubenswrapper[4816]: I0311 12:00:09.745441 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.745465 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.748891 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.759179 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.764904 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") pod \"ovnkube-node-dkh2h\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.775392 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.786756 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.796746 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.809859 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.825196 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829230 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829315 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829330 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829359 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.829376 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.838629 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.853405 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.859494 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.873161 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: W0311 12:00:09.877532 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbe3bb6_8bf9_40b5_8f4f_0d136e285528.slice/crio-062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b WatchSource:0}: Error finding container 062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b: Status 404 returned error can't find the container with id 062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.883947 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.899344 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.932549 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.932614 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.932629 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.932651 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:09 crc kubenswrapper[4816]: I0311 12:00:09.933119 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:09Z","lastTransitionTime":"2026-03-11T12:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035869 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035904 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035912 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035925 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.035935 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142387 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142447 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142464 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142485 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.142506 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245593 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245633 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245642 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245656 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.245665 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348694 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348769 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348789 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348824 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.348847 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.452342 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.452621 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.452843 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.452991 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.453130 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.530039 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c" exitCode=0 Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.530431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.530688 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerStarted","Data":"354b5b811c9401fe1d22e136dc2cd35d028058f62d6d104e3fb9da21027e38f6"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.532472 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mdbt5" event={"ID":"a30d3e88-e081-4303-a202-1b7505629539","Type":"ContainerStarted","Data":"cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.532521 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mdbt5" event={"ID":"a30d3e88-e081-4303-a202-1b7505629539","Type":"ContainerStarted","Data":"c5a149483c2fd203a95d61febf5181396381d452588438b940fdacc6f89c4e91"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.535623 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" exitCode=0 Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.535697 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" 
event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.535728 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.538060 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"fdd18d04896447f4bc152e9c4aaaaefe467b16481b593ffa86a7ed44a9120a06"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.538103 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.538116 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"2d58cf54804a0cc94bf9cfdd8ca1ca7961e514eacea1a5fd5d725dd7aeea2d32"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.547013 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555522 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555562 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555573 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555591 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.555603 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.562966 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.576074 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.628952 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.643629 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661487 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661530 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661542 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661560 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.661574 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.662785 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.680677 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.698695 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.712324 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.733570 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.747687 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.759819 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764204 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764220 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764268 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.764284 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.767970 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.780686 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 
11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.790704 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.808434 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.820373 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.830225 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.840452 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.858552 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866814 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866853 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866887 4816 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866904 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.866915 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.868831 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.881792 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.902494 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.923459 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.956167 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.968946 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.968981 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.968995 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.969010 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.969021 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:10Z","lastTransitionTime":"2026-03-11T12:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.980599 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:10 crc kubenswrapper[4816]: I0311 12:00:10.994028 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.009558 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd18d04896447f4bc152e9c4aaaaefe467b16481b593ffa86a7ed44a9120a06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 
12:00:11.072850 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.072900 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.072913 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.072932 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.072942 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.129881 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.129946 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.129973 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:11 crc kubenswrapper[4816]: E0311 12:00:11.131018 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.131323 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 12:00:11 crc kubenswrapper[4816]: E0311 12:00:11.131444 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:11 crc kubenswrapper[4816]: E0311 12:00:11.131504 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:11 crc kubenswrapper[4816]: E0311 12:00:11.131603 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176094 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176162 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176179 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176205 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.176222 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279709 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279778 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279787 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279802 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.279830 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382472 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382524 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382555 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.382567 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485692 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485738 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485747 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485768 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.485782 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.558717 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerStarted","Data":"9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.561948 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.561995 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.562007 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.562017 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.567080 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"03089bdc2168c6be2aaace4e060f2b5242a0fd983ee808f6e61f5d7722767c13"} Mar 11 12:00:11 
crc kubenswrapper[4816]: I0311 12:00:11.567133 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4241b51ce8ec4c60ab9d9911594f165da55eead854b434a39dd6fd18002ba112"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.578695 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588465 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588475 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588495 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.588508 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.593810 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:
00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.609954 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.619605 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x2vtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6497a90c-3b50-4dba-80d3-085c57f4f567\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1d5c4f57d4820ed42117939aca9f75eaece467e386f188a336edeb0c931401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x2vtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.627894 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fdff21c-644f-4443-a268-f98c91ea120a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd18d04896447f4bc152e9c4aaaaefe467b16481b593ffa86a7ed44a9120a06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcc062c271cd12993a2f94302ad7910d23ab33f9
e9c36dd18bc3d6cf66582bc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqdgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b4v82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.645743 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8dcc321-50db-4df1-b303-54dd8f895f54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63fd6507da13452b1667cf5fe5f86b72ef07a8c9a57ff43d74684276a8d0633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a349dc3366c99110d521b2e0c35464b1744fada92c1c29aa973b2f60e65cc5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6cfddc1aebbcb43615aafa5620bc9fa877464b85e77a3df5367a6e93c3aa066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768d90b24f08a77bfeb4a1c0540c799d91922b7e872b166c8be18799bd274aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee54943851f65b50e55ee3f0da95763307da8b51fefd2a8b83985ec43c4e15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f71025c079395f8d4addd4014b4f26f907160284b902b119be08da329eec418f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc48525cefc43b0355c1e3d3c24c807755e603797651810c54c8453cdc88da0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://036ecc68973976bcbb1f4df6a3558e9c2606a519d39fb654b86061e0ef78d5a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.657000 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611e39ed4e5d7fcae87f81c718ae7237dfb72a021a40c1fe2df5131b6045a550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
3-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.667548 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.676395 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.687329 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690882 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690940 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690952 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690978 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.690993 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.701259 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.712785 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.735270 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.748917 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.756500 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d80c03e-d5ef-48a2-9464-5e64b20f225e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf21a7ab5fa8cc53f0b72a0a78d73e04bbd62a213f193124a0a00b4512b022c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df758ccff81c0ef52d954903c58a6d8e01ce1498d2f4a45057799fc584c70887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.769142 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a8570a1-8304-4344-ac73-7346c594a222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T11:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T11:59:41Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 11:59:40.721716 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 11:59:40.721954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 11:59:40.723455 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-319748104/tls.crt::/tmp/serving-cert-319748104/tls.key\\\\\\\"\\\\nI0311 11:59:40.983979 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 11:59:40.987746 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 11:59:40.987769 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 11:59:40.987794 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 11:59:40.987801 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 11:59:40.995483 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 11:59:40.995533 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995543 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 11:59:40.995556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0311 11:59:40.995552 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 11:59:40.995565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 11:59:40.995592 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 11:59:40.995600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 11:59:40.996695 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T11:59:40Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T11:58:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789
a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T11:58:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T11:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T11:58:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.779449 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03089bdc2168c6be2aaace4e060f2b5242a0fd983ee808f6e61f5d7722767c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4241b51ce8ec4c60ab9d9911594f165da55eead854b434a39dd6fd18002ba112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc 
kubenswrapper[4816]: I0311 12:00:11.787642 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.795807 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.795927 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.796009 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.796077 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.796159 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.797138 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.807480 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"020fe9c8-a66d-450b-b7b3-b83bcd2bf552\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d4bcc8ea702ae3bb0e9bb811442af19ea9db3e9a71ab3b143ecbd1c4e0ddc27c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gd5cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zbg7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.816848 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T11:59:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.828098 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mdbt5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a30d3e88-e081-4303-a202-1b7505629539\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q5sxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mdbt5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.845398 4816 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T12:00:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T12:00:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T12:00:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj5rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T12:00:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-dkh2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899386 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899428 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899439 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899461 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.899472 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:11Z","lastTransitionTime":"2026-03-11T12:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:11 crc kubenswrapper[4816]: I0311 12:00:11.923223 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.923206298 podStartE2EDuration="12.923206298s" podCreationTimestamp="2026-03-11 11:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:11.920229993 +0000 UTC m=+98.511493960" watchObservedRunningTime="2026-03-11 12:00:11.923206298 +0000 UTC m=+98.514470265" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002809 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002862 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002877 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002897 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.002911 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.044553 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podStartSLOduration=52.044520909 podStartE2EDuration="52.044520909s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:12.044364815 +0000 UTC m=+98.635628792" watchObservedRunningTime="2026-03-11 12:00:12.044520909 +0000 UTC m=+98.635784876" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.044751 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-x2vtk" podStartSLOduration=52.044747316 podStartE2EDuration="52.044747316s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:12.005130292 +0000 UTC m=+98.596394299" watchObservedRunningTime="2026-03-11 12:00:12.044747316 +0000 UTC m=+98.636011283" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105814 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105882 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105894 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105915 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.105927 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209477 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209551 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209569 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.209580 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.222275 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bwrxd"] Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.223282 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.226089 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.226146 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.226602 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.226616 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.281116 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mdbt5" podStartSLOduration=52.281095368 podStartE2EDuration="52.281095368s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:12.280514022 +0000 UTC m=+98.871778009" watchObservedRunningTime="2026-03-11 12:00:12.281095368 +0000 UTC m=+98.872359345" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312324 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312352 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312363 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312378 4816 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.312389 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.368896 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=9.36887507 podStartE2EDuration="9.36887507s" podCreationTimestamp="2026-03-11 12:00:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:12.32135751 +0000 UTC m=+98.912621487" watchObservedRunningTime="2026-03-11 12:00:12.36887507 +0000 UTC m=+98.960139057" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.373807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrt84\" (UniqueName: \"kubernetes.io/projected/6d97dc61-2b0e-413c-942a-b86cb01f20a1-kube-api-access-xrt84\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.373875 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d97dc61-2b0e-413c-942a-b86cb01f20a1-host\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.373905 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d97dc61-2b0e-413c-942a-b86cb01f20a1-serviceca\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415361 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415420 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415430 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415451 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.415466 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.425679 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"] Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.426147 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.448592 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-tt4rv"] Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.449280 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.449379 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.454764 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.474782 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.475208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrt84\" (UniqueName: \"kubernetes.io/projected/6d97dc61-2b0e-413c-942a-b86cb01f20a1-kube-api-access-xrt84\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.475360 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d97dc61-2b0e-413c-942a-b86cb01f20a1-host\") pod \"node-ca-bwrxd\" (UID: 
\"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.475444 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d97dc61-2b0e-413c-942a-b86cb01f20a1-serviceca\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.475673 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d97dc61-2b0e-413c-942a-b86cb01f20a1-host\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.476742 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d97dc61-2b0e-413c-942a-b86cb01f20a1-serviceca\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.517997 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.518305 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.518393 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.518541 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.518628 4816 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.546784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrt84\" (UniqueName: \"kubernetes.io/projected/6d97dc61-2b0e-413c-942a-b86cb01f20a1-kube-api-access-xrt84\") pod \"node-ca-bwrxd\" (UID: \"6d97dc61-2b0e-413c-942a-b86cb01f20a1\") " pod="openshift-image-registry/node-ca-bwrxd" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.575148 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56" exitCode=0 Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.575298 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"9783b2ea2ca98c9bef532d998d57e59ecf703e439d61a21baefc82ccd5937a56"} Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576272 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576391 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576515 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pmjp\" (UniqueName: \"kubernetes.io/projected/91b59d67-b771-4a57-b2a8-84303ec4d9bd-kube-api-access-8pmjp\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576597 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qh5\" (UniqueName: \"kubernetes.io/projected/09dd02d0-be8a-4c51-9dfd-d601d05cd866-kube-api-access-l8qh5\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.576632 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.583801 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"}
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.583963 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"}
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621786 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621821 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621829 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621845 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.621857 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677415 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pmjp\" (UniqueName: \"kubernetes.io/projected/91b59d67-b771-4a57-b2a8-84303ec4d9bd-kube-api-access-8pmjp\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677466 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677483 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qh5\" (UniqueName: \"kubernetes.io/projected/09dd02d0-be8a-4c51-9dfd-d601d05cd866-kube-api-access-l8qh5\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.677713 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.677845 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.677928 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:13.177911383 +0000 UTC m=+99.769175350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.678357 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.679432 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724782 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724836 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724848 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724865 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.724875 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.730775 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qh5\" (UniqueName: \"kubernetes.io/projected/09dd02d0-be8a-4c51-9dfd-d601d05cd866-kube-api-access-l8qh5\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.730832 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09dd02d0-be8a-4c51-9dfd-d601d05cd866-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xh655\" (UID: \"09dd02d0-be8a-4c51-9dfd-d601d05cd866\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.732578 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pmjp\" (UniqueName: \"kubernetes.io/projected/91b59d67-b771-4a57-b2a8-84303ec4d9bd-kube-api-access-8pmjp\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.738820 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828359 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828399 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828418 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.828431 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:12 crc kubenswrapper[4816]: W0311 12:00:12.831639 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09dd02d0_be8a_4c51_9dfd_d601d05cd866.slice/crio-b02f005abc394542d68788747b373edeeaca50785e5d5b93d39bbf1ce9bf3293 WatchSource:0}: Error finding container b02f005abc394542d68788747b373edeeaca50785e5d5b93d39bbf1ce9bf3293: Status 404 returned error can't find the container with id b02f005abc394542d68788747b373edeeaca50785e5d5b93d39bbf1ce9bf3293
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.839182 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bwrxd"
Mar 11 12:00:12 crc kubenswrapper[4816]: W0311 12:00:12.873614 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d97dc61_2b0e_413c_942a_b86cb01f20a1.slice/crio-646d564ba5b1eb8a60d055c2aa0b71a965805d0f9bd8f69c9e444c3b7a9ecfdd WatchSource:0}: Error finding container 646d564ba5b1eb8a60d055c2aa0b71a965805d0f9bd8f69c9e444c3b7a9ecfdd: Status 404 returned error can't find the container with id 646d564ba5b1eb8a60d055c2aa0b71a965805d0f9bd8f69c9e444c3b7a9ecfdd
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.879089 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.879549 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.879513732 +0000 UTC m=+115.470777729 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.881456 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.881549 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.881637 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.881729 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882017 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882070 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882098 4816 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882208 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.882183808 +0000 UTC m=+115.473447815 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882345 4816 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882374 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882397 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.882385454 +0000 UTC m=+115.473649421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882406 4816 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882428 4816 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882441 4816 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882466 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.882458136 +0000 UTC m=+115.473722103 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 11 12:00:12 crc kubenswrapper[4816]: E0311 12:00:12.882495 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.882471006 +0000 UTC m=+115.473735013 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933372 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933425 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933440 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933463 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:12 crc kubenswrapper[4816]: I0311 12:00:12.933480 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:12Z","lastTransitionTime":"2026-03-11T12:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038432 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038510 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038525 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038596 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.038611 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.129609 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.130050 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.129707 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.130162 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.129624 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.130209 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.140985 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.141212 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.141333 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.141411 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.141498 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.184618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.184803 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:13 crc kubenswrapper[4816]: E0311 12:00:13.184877 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:14.184860679 +0000 UTC m=+100.776124646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245054 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245100 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245113 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245130 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.245140 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351535 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351640 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351672 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351713 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.351740 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454446 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454510 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454533 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454567 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.454586 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557391 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557423 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557435 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557451 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.557462 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.589308 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="e4827fb1c91db3692da5430b5d9b64c1e1fb86fb9225c92506d9d7149ce77fd8" exitCode=0
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.589372 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"e4827fb1c91db3692da5430b5d9b64c1e1fb86fb9225c92506d9d7149ce77fd8"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.591667 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" event={"ID":"09dd02d0-be8a-4c51-9dfd-d601d05cd866","Type":"ContainerStarted","Data":"76cb1fef8f63512b1532dedbed8b375110bb0e659911c9a42366bb82778856cc"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.591719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" event={"ID":"09dd02d0-be8a-4c51-9dfd-d601d05cd866","Type":"ContainerStarted","Data":"e59a45203ea1cde69f67b68fd8d17d80c31eb16d0008cfce0f5a04c13e1dc1b1"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.591731 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" event={"ID":"09dd02d0-be8a-4c51-9dfd-d601d05cd866","Type":"ContainerStarted","Data":"b02f005abc394542d68788747b373edeeaca50785e5d5b93d39bbf1ce9bf3293"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.595822 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ea705061b843fcacd713f916734a7471b25708babd9c3064fdb81c43ca0e292e"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.598486 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bwrxd" event={"ID":"6d97dc61-2b0e-413c-942a-b86cb01f20a1","Type":"ContainerStarted","Data":"e98ffb21cd666e1b8334dae28dfc28dfcea09c353ad005c552c43a056fa9bb35"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.598558 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bwrxd" event={"ID":"6d97dc61-2b0e-413c-942a-b86cb01f20a1","Type":"ContainerStarted","Data":"646d564ba5b1eb8a60d055c2aa0b71a965805d0f9bd8f69c9e444c3b7a9ecfdd"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.634505 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bwrxd" podStartSLOduration=53.634463514 podStartE2EDuration="53.634463514s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:13.633004872 +0000 UTC m=+100.224268839" watchObservedRunningTime="2026-03-11 12:00:13.634463514 +0000 UTC m=+100.225727511"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661202 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661219 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661286 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.661308 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768271 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768309 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768321 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768337 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.768350 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871027 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871070 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871080 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871096 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.871107 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974046 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974092 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974105 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974120 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 12:00:13 crc kubenswrapper[4816]: I0311 12:00:13.974132 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:13Z","lastTransitionTime":"2026-03-11T12:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077403 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077434 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077445 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077457 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.077467 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:14Z","lastTransitionTime":"2026-03-11T12:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.130472 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:14 crc kubenswrapper[4816]: E0311 12:00:14.130604 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180056 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180091 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180101 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180114 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.180124 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:14Z","lastTransitionTime":"2026-03-11T12:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.217557 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:14 crc kubenswrapper[4816]: E0311 12:00:14.217698 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:14 crc kubenswrapper[4816]: E0311 12:00:14.217776 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:16.217757083 +0000 UTC m=+102.809021060 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283025 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283117 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283143 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283180 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.283207 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:14Z","lastTransitionTime":"2026-03-11T12:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355401 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355439 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355450 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355465 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.355478 4816 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T12:00:14Z","lastTransitionTime":"2026-03-11T12:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.419750 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xh655" podStartSLOduration=54.419732132 podStartE2EDuration="54.419732132s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:13.659458208 +0000 UTC m=+100.250722175" watchObservedRunningTime="2026-03-11 12:00:14.419732132 +0000 UTC m=+101.010996099" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.420591 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6"] Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.421123 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.424464 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.424596 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.426157 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.426344 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520775 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e52741-8cc7-4b62-8b75-5cae7f35a099-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520889 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2e52741-8cc7-4b62-8b75-5cae7f35a099-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520963 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.520996 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e52741-8cc7-4b62-8b75-5cae7f35a099-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.606372 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.608932 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="ec01c00f45917fb96ae9b2d32ebde4f2cc28a9e248f785ef48ad05897d866083" exitCode=0 Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.609062 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"ec01c00f45917fb96ae9b2d32ebde4f2cc28a9e248f785ef48ad05897d866083"} Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621463 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2e52741-8cc7-4b62-8b75-5cae7f35a099-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621542 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621568 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e52741-8cc7-4b62-8b75-5cae7f35a099-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621609 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621629 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e52741-8cc7-4b62-8b75-5cae7f35a099-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621683 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.621865 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f2e52741-8cc7-4b62-8b75-5cae7f35a099-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.622602 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2e52741-8cc7-4b62-8b75-5cae7f35a099-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.636096 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e52741-8cc7-4b62-8b75-5cae7f35a099-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.643867 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e52741-8cc7-4b62-8b75-5cae7f35a099-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gt7f6\" (UID: \"f2e52741-8cc7-4b62-8b75-5cae7f35a099\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: I0311 12:00:14.738760 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" Mar 11 12:00:14 crc kubenswrapper[4816]: W0311 12:00:14.759666 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e52741_8cc7_4b62_8b75_5cae7f35a099.slice/crio-14d86cbebce3b0c032e715d559da9c520c7f26d0f3fafd903553618ffa2334ab WatchSource:0}: Error finding container 14d86cbebce3b0c032e715d559da9c520c7f26d0f3fafd903553618ffa2334ab: Status 404 returned error can't find the container with id 14d86cbebce3b0c032e715d559da9c520c7f26d0f3fafd903553618ffa2334ab Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.124979 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.129500 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.129522 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.129507 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:15 crc kubenswrapper[4816]: E0311 12:00:15.129623 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 12:00:15 crc kubenswrapper[4816]: E0311 12:00:15.129734 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:15 crc kubenswrapper[4816]: E0311 12:00:15.129794 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.133409 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.612861 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" event={"ID":"f2e52741-8cc7-4b62-8b75-5cae7f35a099","Type":"ContainerStarted","Data":"96fbe9f8a8110e2f89f83793f4dee33eb351439613fc2f280ac1bef28f6ec76e"} Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.612905 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" event={"ID":"f2e52741-8cc7-4b62-8b75-5cae7f35a099","Type":"ContainerStarted","Data":"14d86cbebce3b0c032e715d559da9c520c7f26d0f3fafd903553618ffa2334ab"} Mar 11 12:00:15 crc kubenswrapper[4816]: 
I0311 12:00:15.616037 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="69a3bb3f459337a3e6d40ddd15adb16c9d0859a640da085202a485ffb360e548" exitCode=0 Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.616063 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"69a3bb3f459337a3e6d40ddd15adb16c9d0859a640da085202a485ffb360e548"} Mar 11 12:00:15 crc kubenswrapper[4816]: I0311 12:00:15.631757 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gt7f6" podStartSLOduration=55.631742633 podStartE2EDuration="55.631742633s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:15.63060197 +0000 UTC m=+102.221865937" watchObservedRunningTime="2026-03-11 12:00:15.631742633 +0000 UTC m=+102.223006600" Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.129898 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:16 crc kubenswrapper[4816]: E0311 12:00:16.130678 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd" Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.235540 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:16 crc kubenswrapper[4816]: E0311 12:00:16.235692 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:16 crc kubenswrapper[4816]: E0311 12:00:16.235746 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:20.235731465 +0000 UTC m=+106.826995432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.623438 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerStarted","Data":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.623744 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.623776 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.625889 4816 generic.go:334] "Generic (PLEG): container finished" podID="020fe9c8-a66d-450b-b7b3-b83bcd2bf552" containerID="8e0691bd18a723d9479fcad26099fc1053d7625ff2a03ddd90d7658027238c06" exitCode=0 Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.625928 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerDied","Data":"8e0691bd18a723d9479fcad26099fc1053d7625ff2a03ddd90d7658027238c06"} Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.658182 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podStartSLOduration=56.658141562 podStartE2EDuration="56.658141562s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-11 12:00:16.653032586 +0000 UTC m=+103.244296553" watchObservedRunningTime="2026-03-11 12:00:16.658141562 +0000 UTC m=+103.249405529" Mar 11 12:00:16 crc kubenswrapper[4816]: I0311 12:00:16.671748 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.129909 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:17 crc kubenswrapper[4816]: E0311 12:00:17.130150 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.130318 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:17 crc kubenswrapper[4816]: E0311 12:00:17.130417 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.130503 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:17 crc kubenswrapper[4816]: E0311 12:00:17.130718 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.634823 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" event={"ID":"020fe9c8-a66d-450b-b7b3-b83bcd2bf552","Type":"ContainerStarted","Data":"abde7c4c71693e6850ace298ecc3b9148ac00443be5c61daf3e4b93bc817d793"}
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.635328 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.671065 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zbg7x" podStartSLOduration=57.671044474 podStartE2EDuration="57.671044474s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:17.670316553 +0000 UTC m=+104.261580550" watchObservedRunningTime="2026-03-11 12:00:17.671044474 +0000 UTC m=+104.262308441"
Mar 11 12:00:17 crc kubenswrapper[4816]: I0311 12:00:17.705699 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h"
Mar 11 12:00:18 crc kubenswrapper[4816]: I0311 12:00:18.129638 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:18 crc kubenswrapper[4816]: E0311 12:00:18.129762 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd"
Mar 11 12:00:18 crc kubenswrapper[4816]: I0311 12:00:18.519381 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tt4rv"]
Mar 11 12:00:18 crc kubenswrapper[4816]: I0311 12:00:18.637659 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:18 crc kubenswrapper[4816]: E0311 12:00:18.638435 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd"
Mar 11 12:00:19 crc kubenswrapper[4816]: I0311 12:00:19.129961 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:19 crc kubenswrapper[4816]: I0311 12:00:19.130003 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:19 crc kubenswrapper[4816]: I0311 12:00:19.130097 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:19 crc kubenswrapper[4816]: E0311 12:00:19.130208 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 12:00:19 crc kubenswrapper[4816]: E0311 12:00:19.130353 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 12:00:19 crc kubenswrapper[4816]: E0311 12:00:19.130607 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 12:00:20 crc kubenswrapper[4816]: I0311 12:00:20.130565 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:20 crc kubenswrapper[4816]: E0311 12:00:20.131120 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tt4rv" podUID="91b59d67-b771-4a57-b2a8-84303ec4d9bd"
Mar 11 12:00:20 crc kubenswrapper[4816]: I0311 12:00:20.284011 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:20 crc kubenswrapper[4816]: E0311 12:00:20.284176 4816 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:20 crc kubenswrapper[4816]: E0311 12:00:20.284231 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs podName:91b59d67-b771-4a57-b2a8-84303ec4d9bd nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.284216247 +0000 UTC m=+114.875480214 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs") pod "network-metrics-daemon-tt4rv" (UID: "91b59d67-b771-4a57-b2a8-84303ec4d9bd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 11 12:00:20 crc kubenswrapper[4816]: I0311 12:00:20.988723 4816 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 11 12:00:20 crc kubenswrapper[4816]: I0311 12:00:20.988851 4816 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.028450 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pjsgk"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.029375 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.029523 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.029819 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.030368 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.030796 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.031231 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.031823 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.032307 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.032592 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.035482 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.035786 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.037063 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.037153 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.037857 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.039084 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.039610 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.039670 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.039889 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.040016 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.042818 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.043068 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.045178 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.046262 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.046903 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.050195 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.051946 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.052208 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.056566 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.057065 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.057621 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.058508 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.059104 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.059623 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.059651 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.063572 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.059786 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079225 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079325 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079332 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079459 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.079682 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081050 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081129 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081192 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081338 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081343 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081144 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.081600 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.083359 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dh658"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.084360 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dh658"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.088643 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.088842 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.089615 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.090201 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.091957 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092585 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092625 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092648 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092677 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-encryption-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092700 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t2q6\" (UniqueName: \"kubernetes.io/projected/24bf5f7b-1059-487a-95e7-ab72af29801e-kube-api-access-7t2q6\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092726 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092751 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092804 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-client\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-images\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092847 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092871 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092893 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24bf5f7b-1059-487a-95e7-ab72af29801e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092916 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-client\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit-dir\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092965 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.092987 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5wr\" (UniqueName: \"kubernetes.io/projected/dbdb4690-7503-43ee-9e26-34af04f30235-kube-api-access-fh5wr\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-image-import-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093101 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093129 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093171 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093194 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6fz\" (UniqueName: \"kubernetes.io/projected/7ec67c73-6257-41dc-b848-ba547368c957-kube-api-access-mj6fz\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093237 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-service-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093286 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-dir\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093311 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq894\" (UniqueName: \"kubernetes.io/projected/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-kube-api-access-sq894\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093333 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093352 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-node-pullsecrets\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093370 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj7ng\" (UniqueName: \"kubernetes.io/projected/17c97aa5-8179-41d7-adcb-c4da341f4cec-kube-api-access-hj7ng\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093413 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-config\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47r6l\" (UniqueName: \"kubernetes.io/projected/8c843417-3e01-48f9-b0b6-845fbbbf7eab-kube-api-access-47r6l\") pod \"downloads-7954f5f757-dh658\" (UID: \"8c843417-3e01-48f9-b0b6-845fbbbf7eab\") " pod="openshift-console/downloads-7954f5f757-dh658"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093474 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz99z\" (UniqueName: \"kubernetes.io/projected/3af1f0c3-1a92-49f9-beec-dff95561c5dd-kube-api-access-dz99z\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093493 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093535 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-policies\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093565 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093585 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cqxl\" (UniqueName: \"kubernetes.io/projected/cf7eaa86-2d32-4321-9016-e785320de3e2-kube-api-access-8cqxl\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093604 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093625 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-serving-cert\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093645 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec67c73-6257-41dc-b848-ba547368c957-serving-cert\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093666 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093684 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-encryption-config\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093704 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-config\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093735 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dbdb4690-7503-43ee-9e26-34af04f30235-machine-approver-tls\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093756 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093775 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7eaa86-2d32-4321-9016-e785320de3e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093795 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-auth-proxy-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093818 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093838 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc
kubenswrapper[4816]: I0311 12:00:21.093862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093895 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-serving-cert\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093916 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093945 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093967 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.093989 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094032 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094054 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094073 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094486 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.094602 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.095014 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.097988 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098186 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098380 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098557 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098665 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:00:21 
crc kubenswrapper[4816]: I0311 12:00:21.098711 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098758 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098925 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.098940 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.099267 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.099357 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.099315 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.099547 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.110372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.110605 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.110747 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.110873 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.111036 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.112677 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.112859 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.112996 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.113129 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.111321 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.112678 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.114124 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.114278 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: 
I0311 12:00:21.115071 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.115774 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fxsjj"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.116154 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.116505 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.117711 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.117954 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.118110 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.118568 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.118953 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.119076 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.120524 4816 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.149040 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.149896 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.151553 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.151584 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.151742 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.152686 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.153096 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.155088 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.155627 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mzkr9"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.156150 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.156644 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rft5w"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.156743 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.157440 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.162131 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.162498 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.162564 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.162670 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.170644 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.170970 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.170990 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.171138 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.171326 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.172011 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.172184 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.174752 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.174977 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.177010 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.177235 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.177664 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.177916 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.178080 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.178696 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.178946 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.182374 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 
12:00:21.185209 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.185473 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.185649 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.185807 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186039 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186266 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186435 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186603 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186794 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.186946 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187025 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187324 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187368 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187603 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.187691 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.188079 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.188227 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.189428 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.193334 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.193970 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.194100 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.194791 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.194992 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195064 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l762\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-kube-api-access-6l762\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195102 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195130 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195795 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195820 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f66d48af-027e-448b-9897-9f0c62fbd6c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195861 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" 
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195882 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195900 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195931 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-encryption-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t2q6\" (UniqueName: \"kubernetes.io/projected/24bf5f7b-1059-487a-95e7-ab72af29801e-kube-api-access-7t2q6\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.195982 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196004 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196026 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196046 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-client\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196071 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-images\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196109 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24bf5f7b-1059-487a-95e7-ab72af29801e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196135 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196165 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-client\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196186 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh5wr\" (UniqueName: \"kubernetes.io/projected/dbdb4690-7503-43ee-9e26-34af04f30235-kube-api-access-fh5wr\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196239 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-image-import-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196273 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit-dir\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196293 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196312 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196334 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196358 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196378 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6fz\" (UniqueName: \"kubernetes.io/projected/7ec67c73-6257-41dc-b848-ba547368c957-kube-api-access-mj6fz\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196400 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-service-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196425 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-dir\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196451 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq894\" (UniqueName: \"kubernetes.io/projected/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-kube-api-access-sq894\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196492 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-node-pullsecrets\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: 
\"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196926 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj7ng\" (UniqueName: \"kubernetes.io/projected/17c97aa5-8179-41d7-adcb-c4da341f4cec-kube-api-access-hj7ng\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196942 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-config\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.196988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47r6l\" (UniqueName: \"kubernetes.io/projected/8c843417-3e01-48f9-b0b6-845fbbbf7eab-kube-api-access-47r6l\") pod \"downloads-7954f5f757-dh658\" (UID: \"8c843417-3e01-48f9-b0b6-845fbbbf7eab\") " pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197012 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") pod 
\"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197032 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197051 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197076 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-policies\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197093 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197112 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dz99z\" (UniqueName: \"kubernetes.io/projected/3af1f0c3-1a92-49f9-beec-dff95561c5dd-kube-api-access-dz99z\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197134 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cqxl\" (UniqueName: \"kubernetes.io/projected/cf7eaa86-2d32-4321-9016-e785320de3e2-kube-api-access-8cqxl\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197153 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-serving-cert\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec67c73-6257-41dc-b848-ba547368c957-serving-cert\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197189 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197207 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-encryption-config\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197225 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-config\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197267 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dbdb4690-7503-43ee-9e26-34af04f30235-machine-approver-tls\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197309 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 
11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197328 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7eaa86-2d32-4321-9016-e785320de3e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197359 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-auth-proxy-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197379 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197401 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197491 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: 
\"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197570 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-serving-cert\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197639 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197696 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f66d48af-027e-448b-9897-9f0c62fbd6c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197745 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-image-import-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.197813 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit-dir\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.198539 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.199418 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.199758 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.200774 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.201330 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.201742 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.202237 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.202449 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.205692 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.205763 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.205895 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.206839 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-images\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.207345 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"] Mar 11 12:00:21 crc 
kubenswrapper[4816]: I0311 12:00:21.207702 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.208513 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.208898 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"] Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.209068 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.209518 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.209546 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.209696 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211026 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211077 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.210419 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-config\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211336 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-encryption-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec67c73-6257-41dc-b848-ba547368c957-service-ca-bundle\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.211860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-dir\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.212526 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.212588 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.212887 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3af1f0c3-1a92-49f9-beec-dff95561c5dd-node-pullsecrets\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.212964 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-serving-cert\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.213422 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.215091 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-serving-ca\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.215207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.215282 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbdb4690-7503-43ee-9e26-34af04f30235-auth-proxy-config\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.216042 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1f0c3-1a92-49f9-beec-dff95561c5dd-etcd-client\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.216518 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.217868 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.217880 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-config\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.218671 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.218724 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6m5gg"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.218974 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-etcd-client\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.219382 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.219971 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.220432 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6m5gg"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.221832 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7eaa86-2d32-4321-9016-e785320de3e2-config\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.221893 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.222776 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.224457 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24bf5f7b-1059-487a-95e7-ab72af29801e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.224624 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.225851 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3af1f0c3-1a92-49f9-beec-dff95561c5dd-audit\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.225937 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf7eaa86-2d32-4321-9016-e785320de3e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.226466 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17c97aa5-8179-41d7-adcb-c4da341f4cec-audit-policies\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.226620 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.227090 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec67c73-6257-41dc-b848-ba547368c957-serving-cert\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.227471 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pjsgk"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.227677 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.227744 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.242154 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.242230 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-encryption-config\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.242640 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dbdb4690-7503-43ee-9e26-34af04f30235-machine-approver-tls\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.242747 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.243158 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.243169 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c97aa5-8179-41d7-adcb-c4da341f4cec-serving-cert\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.243606 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.244222 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.246790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.249305 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.251283 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-28g7h"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.252880 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.253670 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.254716 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.254956 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zln7t"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.259066 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.261870 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tqt25"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.263424 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.266903 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wgxgk"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.270108 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.272495 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.278938 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.279577 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgxgk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.280303 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.280402 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.281199 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.281317 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282009 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282134 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282503 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282763 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282893 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dh658"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282920 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.282999 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.283558 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.284823 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.285859 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.286773 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fxsjj"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.287718 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.288645 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.289770 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2ltv9"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.290353 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ltv9"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.290862 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bb6wh"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.291528 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.292202 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rft5w"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.292367 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.292639 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.293625 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.294621 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zln7t"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.295621 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mzkr9"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.296584 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.297533 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.298637 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.298975 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f66d48af-027e-448b-9897-9f0c62fbd6c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.299010 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l762\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-kube-api-access-6l762\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.299033 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f66d48af-027e-448b-9897-9f0c62fbd6c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.299050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.299621 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.300566 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.301494 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.302388 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.303259 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ltv9"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.304309 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.306283 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.307182 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mws5d"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.307920 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mws5d"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.308173 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgxgk"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.309218 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-28g7h"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.310142 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.311062 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tqt25"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.312168 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.312269 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.313045 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.313996 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bb6wh"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.315178 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.316174 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.317068 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6bx5p"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.317725 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.318043 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.318952 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"]
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.332224 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.352728 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.372459 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.391821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.412228 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.432534 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.452998 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.481368 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.512732 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.532163 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.552423 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.572302 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.585488 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f66d48af-027e-448b-9897-9f0c62fbd6c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.592180 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.620016 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.632187 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f66d48af-027e-448b-9897-9f0c62fbd6c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.634638 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.653510 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.692165 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.711396 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.731119 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.751854 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.790872 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t2q6\" (UniqueName: \"kubernetes.io/projected/24bf5f7b-1059-487a-95e7-ab72af29801e-kube-api-access-7t2q6\") pod \"cluster-samples-operator-665b6dd947-gwvvh\" (UID: \"24bf5f7b-1059-487a-95e7-ab72af29801e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.792481 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.812622 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.833544 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.852092 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.873608 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.891495 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.926387 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") pod \"oauth-openshift-558db77b4-bz2pp\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.933024 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 11 12:00:21 crc kubenswrapper[4816]: I0311 12:00:21.951408 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.036181 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"
Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.041733 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.041987 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.042980 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.044219 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.072648 4816 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.073241 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.076370 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh5wr\" (UniqueName: \"kubernetes.io/projected/dbdb4690-7503-43ee-9e26-34af04f30235-kube-api-access-fh5wr\") pod \"machine-approver-56656f9798-x5fc4\" (UID: \"dbdb4690-7503-43ee-9e26-34af04f30235\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.093436 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.113284 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.133497 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.133826 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.134218 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.153429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.174978 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.202566 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.214472 4816 request.go:700] Waited for 1.002467244s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.220207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6fz\" (UniqueName: \"kubernetes.io/projected/7ec67c73-6257-41dc-b848-ba547368c957-kube-api-access-mj6fz\") pod \"authentication-operator-69f744f599-ct9ss\" (UID: \"7ec67c73-6257-41dc-b848-ba547368c957\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.240296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") pod \"route-controller-manager-6576b87f9c-cdscr\" (UID: 
\"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.250007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq894\" (UniqueName: \"kubernetes.io/projected/d2d35dfe-af6d-4c32-9e06-4650a6b1d52d-kube-api-access-sq894\") pod \"openshift-apiserver-operator-796bbdcf4f-gc7hf\" (UID: \"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:22 crc kubenswrapper[4816]: W0311 12:00:22.256186 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbdb4690_7503_43ee_9e26_34af04f30235.slice/crio-5a73d8356cf7871c6377033faae9b412d33fae165064c165f502eb83d942cb9c WatchSource:0}: Error finding container 5a73d8356cf7871c6377033faae9b412d33fae165064c165f502eb83d942cb9c: Status 404 returned error can't find the container with id 5a73d8356cf7871c6377033faae9b412d33fae165064c165f502eb83d942cb9c Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.270432 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj7ng\" (UniqueName: \"kubernetes.io/projected/17c97aa5-8179-41d7-adcb-c4da341f4cec-kube-api-access-hj7ng\") pod \"apiserver-7bbb656c7d-r2nzn\" (UID: \"17c97aa5-8179-41d7-adcb-c4da341f4cec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.272394 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.300590 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.301608 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.318165 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.321402 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz99z\" (UniqueName: \"kubernetes.io/projected/3af1f0c3-1a92-49f9-beec-dff95561c5dd-kube-api-access-dz99z\") pod \"apiserver-76f77b778f-pjsgk\" (UID: \"3af1f0c3-1a92-49f9-beec-dff95561c5dd\") " pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.321601 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.330183 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cqxl\" (UniqueName: \"kubernetes.io/projected/cf7eaa86-2d32-4321-9016-e785320de3e2-kube-api-access-8cqxl\") pod \"machine-api-operator-5694c8668f-t5t6b\" (UID: \"cf7eaa86-2d32-4321-9016-e785320de3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:22 crc kubenswrapper[4816]: W0311 12:00:22.338208 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f288b8_4b39_42ac_9835_4fb118a86218.slice/crio-b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243 WatchSource:0}: Error finding container b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243: Status 404 returned error can't find the container with id 
b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243 Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.349594 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47r6l\" (UniqueName: \"kubernetes.io/projected/8c843417-3e01-48f9-b0b6-845fbbbf7eab-kube-api-access-47r6l\") pod \"downloads-7954f5f757-dh658\" (UID: \"8c843417-3e01-48f9-b0b6-845fbbbf7eab\") " pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.371047 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") pod \"controller-manager-879f6c89f-nv429\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.372354 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.383423 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.392605 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.414310 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.433429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.452191 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.471803 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.481931 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.485616 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.490521 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.491989 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.509463 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-dh658" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.511516 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.531903 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.544035 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.551773 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.572963 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.593810 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.596077 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.611144 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.612646 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.632714 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.649610 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ct9ss"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.651961 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: W0311 12:00:22.661186 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec67c73_6257_41dc_b848_ba547368c957.slice/crio-26462429ff0fe4700dd69a2524946294749803dcde6d4034003a12c17da3f2c0 WatchSource:0}: Error finding container 26462429ff0fe4700dd69a2524946294749803dcde6d4034003a12c17da3f2c0: Status 404 returned error can't find the container with id 26462429ff0fe4700dd69a2524946294749803dcde6d4034003a12c17da3f2c0 Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.661661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" event={"ID":"24bf5f7b-1059-487a-95e7-ab72af29801e","Type":"ContainerStarted","Data":"b3518b60616560a1c089330cd245b2abd91a8eed65804675e1dad99f6d5712ce"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.661700 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" 
event={"ID":"24bf5f7b-1059-487a-95e7-ab72af29801e","Type":"ContainerStarted","Data":"6baeb34cd2b51a35dd0cbca03c47cac50adcd61d70cad0bfc9e72eccc859313c"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.663941 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" event={"ID":"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d","Type":"ContainerStarted","Data":"e69334ec2dd44c8efdfb80df538d49618111790fbe3b01ecae1f260aaba837f2"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.671977 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.683085 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" event={"ID":"17c97aa5-8179-41d7-adcb-c4da341f4cec","Type":"ContainerStarted","Data":"398450a631a14a605dba9c08f23071654e7fd20ee860c55a4bbb4b0c32cdcd51"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.690091 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" event={"ID":"dbdb4690-7503-43ee-9e26-34af04f30235","Type":"ContainerStarted","Data":"8324b07daaf09810f2847aa811a6467d4dd16ffa03ac6260c18053c46bcfa025"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.690140 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" event={"ID":"dbdb4690-7503-43ee-9e26-34af04f30235","Type":"ContainerStarted","Data":"5a73d8356cf7871c6377033faae9b412d33fae165064c165f502eb83d942cb9c"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.693755 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.698299 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" event={"ID":"f0f288b8-4b39-42ac-9835-4fb118a86218","Type":"ContainerStarted","Data":"e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.698471 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.698490 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" event={"ID":"f0f288b8-4b39-42ac-9835-4fb118a86218","Type":"ContainerStarted","Data":"b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.701204 4816 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bz2pp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.701275 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.711654 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.713582 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 12:00:22 crc 
kubenswrapper[4816]: I0311 12:00:22.714980 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.733969 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.750906 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"964c09610d05fa085a4adc7f7d902f67376a9168848e403cd849cfc2290dc26d"} Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.751614 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.761725 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.771825 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.785335 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.791143 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.812396 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 11 12:00:22 crc kubenswrapper[4816]: W0311 12:00:22.827165 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef1d29fc_f278_4f20_8362_3c406634d8ff.slice/crio-095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242 WatchSource:0}: Error finding container 095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242: Status 404 returned error can't find the container with id 095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242 Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.833192 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.851464 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.877338 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.887674 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dh658"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.894546 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.915532 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.936696 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.945450 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pjsgk"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.955780 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 11 12:00:22 crc 
kubenswrapper[4816]: I0311 12:00:22.971422 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.983708 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t5t6b"] Mar 11 12:00:22 crc kubenswrapper[4816]: I0311 12:00:22.997188 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.016780 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.032043 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.052168 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.078901 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.093037 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.114461 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.133395 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.152491 4816 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.172064 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.192293 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.212645 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.230247 4816 request.go:700] Waited for 1.939600459s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.231741 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.252334 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.271635 4816 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.292070 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.311341 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.350080 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l762\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-kube-api-access-6l762\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.366992 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f66d48af-027e-448b-9897-9f0c62fbd6c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rxljr\" (UID: \"f66d48af-027e-448b-9897-9f0c62fbd6c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.374163 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.392697 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.411850 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.431950 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.492202 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.493273 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.511956 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0503425c-595f-4ff5-a7eb-c73168d939d5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571626 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571663 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0503425c-595f-4ff5-a7eb-c73168d939d5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571692 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb999b74-ac20-4e84-b2c7-b16906afbf06-metrics-tls\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") 
" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571724 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-config\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571795 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571894 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/da3678d7-b440-44bd-b73b-2b04f1225094-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571935 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.571985 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-trusted-ca\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572017 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572067 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc 
kubenswrapper[4816]: I0311 12:00:23.572097 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rng4c\" (UniqueName: \"kubernetes.io/projected/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-kube-api-access-rng4c\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572158 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572240 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572295 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mdm\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-kube-api-access-p7mdm\") pod 
\"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572344 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-serving-cert\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572402 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572440 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572491 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572523 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-client\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572550 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgwk\" (UniqueName: \"kubernetes.io/projected/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-kube-api-access-psgwk\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572606 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572637 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-service-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572720 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-serving-cert\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572776 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-serving-cert\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572845 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.572934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da3678d7-b440-44bd-b73b-2b04f1225094-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573033 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573104 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfq9m\" (UniqueName: \"kubernetes.io/projected/0503425c-595f-4ff5-a7eb-c73168d939d5-kube-api-access-lfq9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573928 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdnwz\" (UniqueName: \"kubernetes.io/projected/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-kube-api-access-qdnwz\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573963 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlk7\" (UniqueName: \"kubernetes.io/projected/bb999b74-ac20-4e84-b2c7-b16906afbf06-kube-api-access-9dlk7\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.573985 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-config\") pod 
\"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.574005 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.575632 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.075611886 +0000 UTC m=+110.666875963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.667358 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr"] Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675118 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:23 crc 
kubenswrapper[4816]: I0311 12:00:23.675334 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675372 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-certs\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675397 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675419 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675440 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a17171b-c738-4862-a2a0-cbb09219322a-config-volume\") pod \"dns-default-wgxgk\" 
(UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675473 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9c2804-ee65-4a09-9985-d2345aa7f82a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675493 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-apiservice-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-cert\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675547 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgbd4\" (UniqueName: \"kubernetes.io/projected/9a782b5b-9eac-4b5b-8ca8-751111b2459b-kube-api-access-zgbd4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675570 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57df17b9-73f2-468a-8359-5a07f19a5493-serving-cert\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675591 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtlst\" (UniqueName: \"kubernetes.io/projected/57df17b9-73f2-468a-8359-5a07f19a5493-kube-api-access-rtlst\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675611 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slmk9\" (UniqueName: \"kubernetes.io/projected/00d6d506-7c84-4fef-9dc9-85f855533c06-kube-api-access-slmk9\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675659 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57df17b9-73f2-468a-8359-5a07f19a5493-config\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675694 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-socket-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: 
\"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675716 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zj2p\" (UniqueName: \"kubernetes.io/projected/ba5682ea-6a62-4983-b525-5dc9612ad46d-kube-api-access-5zj2p\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675741 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgwk\" (UniqueName: \"kubernetes.io/projected/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-kube-api-access-psgwk\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675812 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-stats-auth\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675837 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-default-certificate\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675869 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-service-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675891 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-serving-cert\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.675942 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-proxy-tls\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 
12:00:23.675967 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-srv-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676006 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676029 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027b1711-77a0-4359-bd98-246217fdb5f8-service-ca-bundle\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676049 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676095 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/db49f265-44d3-468b-8e2f-2246b02b57be-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr7ks\" (UniqueName: \"kubernetes.io/projected/2eaac3e7-6f80-47da-a6c7-e415a0b8edbd-kube-api-access-gr7ks\") pod \"migrator-59844c95c7-4kd2n\" (UID: \"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676145 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89cbz\" (UniqueName: \"kubernetes.io/projected/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-kube-api-access-89cbz\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22d5j\" (UniqueName: \"kubernetes.io/projected/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-kube-api-access-22d5j\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a17171b-c738-4862-a2a0-cbb09219322a-metrics-tls\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" 
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676278 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676303 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9c2804-ee65-4a09-9985-d2345aa7f82a-config\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676324 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxdsn\" (UniqueName: \"kubernetes.io/projected/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-kube-api-access-nxdsn\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676349 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb999b74-ac20-4e84-b2c7-b16906afbf06-metrics-tls\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676373 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0503425c-595f-4ff5-a7eb-c73168d939d5-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676431 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676455 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676478 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/da3678d7-b440-44bd-b73b-2b04f1225094-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 
12:00:23.676499 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a782b5b-9eac-4b5b-8ca8-751111b2459b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676526 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-trusted-ca\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676555 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676579 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/680978cb-e609-4292-827f-cc8a5b9c1438-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676602 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") pod 
\"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676625 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rng4c\" (UniqueName: \"kubernetes.io/projected/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-kube-api-access-rng4c\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676661 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mdm\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-kube-api-access-p7mdm\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676686 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-node-bootstrap-token\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676708 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5p6\" (UniqueName: \"kubernetes.io/projected/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-kube-api-access-6n5p6\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676732 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-serving-cert\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676754 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7wqx\" (UniqueName: \"kubernetes.io/projected/750d6f55-7cf7-4376-8ead-6d481db69c2d-kube-api-access-g7wqx\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676795 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6qxn\" (UniqueName: \"kubernetes.io/projected/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-kube-api-access-g6qxn\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-srv-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676883 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67dd48ce-6361-442d-9552-f06346e4d8d4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676959 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.676981 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-plugins-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677002 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnlqr\" (UniqueName: \"kubernetes.io/projected/db49f265-44d3-468b-8e2f-2246b02b57be-kube-api-access-nnlqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677069 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-client\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677090 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-config\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-registration-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677147 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjg5g\" (UniqueName: \"kubernetes.io/projected/027b1711-77a0-4359-bd98-246217fdb5f8-kube-api-access-zjg5g\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67dd48ce-6361-442d-9552-f06346e4d8d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677191 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-images\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677237 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-proxy-tls\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677330 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-serving-cert\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677353 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qfb\" (UniqueName: \"kubernetes.io/projected/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-kube-api-access-z7qfb\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677375 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-cabundle\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677395 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-webhook-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677423 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677448 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a782b5b-9eac-4b5b-8ca8-751111b2459b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677471 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghclh\" (UniqueName: \"kubernetes.io/projected/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-kube-api-access-ghclh\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677512 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/da3678d7-b440-44bd-b73b-2b04f1225094-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677541 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677567 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677593 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfq9m\" (UniqueName: \"kubernetes.io/projected/0503425c-595f-4ff5-a7eb-c73168d939d5-kube-api-access-lfq9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677619 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677657 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdnwz\" (UniqueName: \"kubernetes.io/projected/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-kube-api-access-qdnwz\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677695 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlk7\" (UniqueName: \"kubernetes.io/projected/bb999b74-ac20-4e84-b2c7-b16906afbf06-kube-api-access-9dlk7\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677722 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-config\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677747 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677770 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4b9c2804-ee65-4a09-9985-d2345aa7f82a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0503425c-595f-4ff5-a7eb-c73168d939d5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677827 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-profile-collector-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677850 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677890 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-config\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677914 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677952 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/00d6d506-7c84-4fef-9dc9-85f855533c06-tmpfs\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-metrics-certs\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.677995 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgkcr\" (UniqueName: \"kubernetes.io/projected/680978cb-e609-4292-827f-cc8a5b9c1438-kube-api-access-bgkcr\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678018 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-key\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678040 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678138 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678162 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd48ce-6361-442d-9552-f06346e4d8d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-csi-data-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678239 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9nvp\" (UniqueName: \"kubernetes.io/projected/1a17171b-c738-4862-a2a0-cbb09219322a-kube-api-access-q9nvp\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678285 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-mountpoint-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: 
\"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678366 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678392 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.678946 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.680341 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.681442 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-service-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: 
\"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.681851 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.682333 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb999b74-ac20-4e84-b2c7-b16906afbf06-metrics-tls\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.682505 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-ca\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.683140 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da3678d7-b440-44bd-b73b-2b04f1225094-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.683205 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:24.183174053 +0000 UTC m=+110.774438110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.683537 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.684427 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-config\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.685067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0503425c-595f-4ff5-a7eb-c73168d939d5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.685792 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-config\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.687049 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-trusted-ca\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.687118 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.687977 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-etcd-client\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.688289 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.688731 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.688935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.689293 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0503425c-595f-4ff5-a7eb-c73168d939d5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.689556 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-serving-cert\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.689677 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.689867 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-serving-cert\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.691740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.692834 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.693536 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/da3678d7-b440-44bd-b73b-2b04f1225094-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.695868 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.697728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-serving-cert\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.708448 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgwk\" (UniqueName: \"kubernetes.io/projected/c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527-kube-api-access-psgwk\") pod \"etcd-operator-b45778765-rft5w\" (UID: \"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.727818 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.759591 4816 generic.go:334] "Generic (PLEG): container finished" podID="3af1f0c3-1a92-49f9-beec-dff95561c5dd" containerID="8cfbeb80eec0131a6c2a8dc0fdd78c6711bab4e499b7b7166380c7f4003724de" exitCode=0
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.759843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" event={"ID":"3af1f0c3-1a92-49f9-beec-dff95561c5dd","Type":"ContainerDied","Data":"8cfbeb80eec0131a6c2a8dc0fdd78c6711bab4e499b7b7166380c7f4003724de"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.759883 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" event={"ID":"3af1f0c3-1a92-49f9-beec-dff95561c5dd","Type":"ContainerStarted","Data":"eb951600f906e521d2c215a58ecae086e8409bd458dc9b1ba7e747be45d886e0"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.762746 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" event={"ID":"ef1d29fc-f278-4f20-8362-3c406634d8ff","Type":"ContainerStarted","Data":"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.762789 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" event={"ID":"ef1d29fc-f278-4f20-8362-3c406634d8ff","Type":"ContainerStarted","Data":"095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.762969 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.766702 4816 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-cdscr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.766748 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.767432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dh658" event={"ID":"8c843417-3e01-48f9-b0b6-845fbbbf7eab","Type":"ContainerStarted","Data":"7d4edc05806ccc7dd99c5bfe1808a0dd4314990cfad0ea42ace041972c048777"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.767480 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dh658" event={"ID":"8c843417-3e01-48f9-b0b6-845fbbbf7eab","Type":"ContainerStarted","Data":"db4a27d14ba72b2bcb597e8b4ff67b1e635ed33d4c53964f9c2bd5f7226df206"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.768027 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dh658"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.770021 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.770069 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.771562 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.772563 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" event={"ID":"dbdb4690-7503-43ee-9e26-34af04f30235","Type":"ContainerStarted","Data":"20f258f941006420dfef84a373d133fce72f2dc844e52d53a66f60e96f528fab"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.774236 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" event={"ID":"564c2921-e9eb-4a24-a5b7-1a8471d1586b","Type":"ContainerStarted","Data":"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.774327 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" event={"ID":"564c2921-e9eb-4a24-a5b7-1a8471d1586b","Type":"ContainerStarted","Data":"306382581adac0ac9b7eb96a682fee969c6c0324fd34514acd435886ca5bcb46"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.774519 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.775825 4816 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nv429 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.775852 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" event={"ID":"f66d48af-027e-448b-9897-9f0c62fbd6c0","Type":"ContainerStarted","Data":"1df3921167d38bf995bc22e8726ca8c5b61c735e26979c6e24183fba2992b175"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.775879 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.778501 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" event={"ID":"cf7eaa86-2d32-4321-9016-e785320de3e2","Type":"ContainerStarted","Data":"0ca81d83bc445fe476353eaa69afe132c69988440474ffaf798ef0516dc80c8d"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.778531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" event={"ID":"cf7eaa86-2d32-4321-9016-e785320de3e2","Type":"ContainerStarted","Data":"4f20efb5ad790d2fc91aac2f36f1b7923395b253a8e689483cd2fcfc5b686b03"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.778543 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" event={"ID":"cf7eaa86-2d32-4321-9016-e785320de3e2","Type":"ContainerStarted","Data":"c8622f1138809cd8e8f26d8decac12051c54c9c0662dfbda911346d18513e58e"}
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.778582 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlk7\" (UniqueName: \"kubernetes.io/projected/bb999b74-ac20-4e84-b2c7-b16906afbf06-kube-api-access-9dlk7\") pod \"dns-operator-744455d44c-mzkr9\" (UID: \"bb999b74-ac20-4e84-b2c7-b16906afbf06\") " pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779016 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22d5j\" (UniqueName: \"kubernetes.io/projected/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-kube-api-access-22d5j\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779071 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a17171b-c738-4862-a2a0-cbb09219322a-metrics-tls\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779104 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9c2804-ee65-4a09-9985-d2345aa7f82a-config\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779168 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxdsn\" (UniqueName: \"kubernetes.io/projected/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-kube-api-access-nxdsn\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779236 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a782b5b-9eac-4b5b-8ca8-751111b2459b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779301 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/680978cb-e609-4292-827f-cc8a5b9c1438-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779345 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-node-bootstrap-token\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779378 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5p6\" (UniqueName: \"kubernetes.io/projected/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-kube-api-access-6n5p6\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779426 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7wqx\" (UniqueName: \"kubernetes.io/projected/750d6f55-7cf7-4376-8ead-6d481db69c2d-kube-api-access-g7wqx\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779477 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67dd48ce-6361-442d-9552-f06346e4d8d4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779513 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6qxn\" (UniqueName: \"kubernetes.io/projected/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-kube-api-access-g6qxn\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779543 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-srv-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnlqr\" (UniqueName: \"kubernetes.io/projected/db49f265-44d3-468b-8e2f-2246b02b57be-kube-api-access-nnlqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779658 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779691 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-plugins-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779728 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779784 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-config\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779817 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-registration-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779851 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjg5g\" (UniqueName: \"kubernetes.io/projected/027b1711-77a0-4359-bd98-246217fdb5f8-kube-api-access-zjg5g\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9c2804-ee65-4a09-9985-d2345aa7f82a-config\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67dd48ce-6361-442d-9552-f06346e4d8d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779934 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-images\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.779968 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-proxy-tls\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780069 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-cabundle\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780110 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-webhook-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780147 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qfb\" (UniqueName: \"kubernetes.io/projected/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-kube-api-access-z7qfb\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780180 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghclh\" (UniqueName: \"kubernetes.io/projected/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-kube-api-access-ghclh\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780212 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a782b5b-9eac-4b5b-8ca8-751111b2459b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780333 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9c2804-ee65-4a09-9985-d2345aa7f82a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780387 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-profile-collector-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780415 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780446 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780476 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgkcr\" (UniqueName: \"kubernetes.io/projected/680978cb-e609-4292-827f-cc8a5b9c1438-kube-api-access-bgkcr\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780506 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/00d6d506-7c84-4fef-9dc9-85f855533c06-tmpfs\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780534 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-metrics-certs\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780564 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-key\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780593 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780622 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780655 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780686 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd48ce-6361-442d-9552-f06346e4d8d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-csi-data-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780749 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9nvp\" (UniqueName: \"kubernetes.io/projected/1a17171b-c738-4862-a2a0-cbb09219322a-kube-api-access-q9nvp\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780779 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-mountpoint-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780814 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780880 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-certs\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780908 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780938 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a17171b-c738-4862-a2a0-cbb09219322a-config-volume\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.780976 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781008 4816
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9c2804-ee65-4a09-9985-d2345aa7f82a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781041 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-apiservice-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-cert\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781224 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slmk9\" (UniqueName: \"kubernetes.io/projected/00d6d506-7c84-4fef-9dc9-85f855533c06-kube-api-access-slmk9\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgbd4\" (UniqueName: \"kubernetes.io/projected/9a782b5b-9eac-4b5b-8ca8-751111b2459b-kube-api-access-zgbd4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57df17b9-73f2-468a-8359-5a07f19a5493-serving-cert\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781437 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtlst\" (UniqueName: \"kubernetes.io/projected/57df17b9-73f2-468a-8359-5a07f19a5493-kube-api-access-rtlst\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783197 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57df17b9-73f2-468a-8359-5a07f19a5493-config\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-socket-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783306 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zj2p\" (UniqueName: 
\"kubernetes.io/projected/ba5682ea-6a62-4983-b525-5dc9612ad46d-kube-api-access-5zj2p\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783344 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783377 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-stats-auth\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783505 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-default-certificate\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783543 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-proxy-tls\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783574 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-srv-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783606 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783661 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027b1711-77a0-4359-bd98-246217fdb5f8-service-ca-bundle\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783696 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: 
I0311 12:00:23.783733 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db49f265-44d3-468b-8e2f-2246b02b57be-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783770 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89cbz\" (UniqueName: \"kubernetes.io/projected/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-kube-api-access-89cbz\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.783807 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr7ks\" (UniqueName: \"kubernetes.io/projected/2eaac3e7-6f80-47da-a6c7-e415a0b8edbd-kube-api-access-gr7ks\") pod \"migrator-59844c95c7-4kd2n\" (UID: \"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.784065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-csi-data-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.784235 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-mountpoint-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.784357 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a17171b-c738-4862-a2a0-cbb09219322a-metrics-tls\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.784809 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-plugins-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.785743 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-config\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.782074 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.785954 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-registration-dir\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.786377 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a782b5b-9eac-4b5b-8ca8-751111b2459b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.786880 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787065 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787157 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787309 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/00d6d506-7c84-4fef-9dc9-85f855533c06-tmpfs\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-node-bootstrap-token\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787618 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.787830 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.788208 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-images\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.781808 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9a782b5b-9eac-4b5b-8ca8-751111b2459b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.788724 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.288704293 +0000 UTC m=+110.879968350 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.789176 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.790467 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9c2804-ee65-4a09-9985-d2345aa7f82a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.791954 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57df17b9-73f2-468a-8359-5a07f19a5493-serving-cert\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.792710 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67dd48ce-6361-442d-9552-f06346e4d8d4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.792806 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.793208 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-cert\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.793374 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ba5682ea-6a62-4983-b525-5dc9612ad46d-socket-dir\") pod 
\"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.782526 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd48ce-6361-442d-9552-f06346e4d8d4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.793588 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.793938 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-srv-cert\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.794054 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.794954 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.794967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a17171b-c738-4862-a2a0-cbb09219322a-config-volume\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.795572 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57df17b9-73f2-468a-8359-5a07f19a5493-config\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.796634 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/027b1711-77a0-4359-bd98-246217fdb5f8-service-ca-bundle\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.796771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-apiservice-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.797058 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" 
event={"ID":"7ec67c73-6257-41dc-b848-ba547368c957","Type":"ContainerStarted","Data":"48eaaf8bde0e4a521556bf18eeb616907cc3beb3e02db6e11a51d124d0e2839b"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.797172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" event={"ID":"7ec67c73-6257-41dc-b848-ba547368c957","Type":"ContainerStarted","Data":"26462429ff0fe4700dd69a2524946294749803dcde6d4034003a12c17da3f2c0"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.797598 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.798100 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-default-certificate\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.800481 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00d6d506-7c84-4fef-9dc9-85f855533c06-webhook-cert\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.801106 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-proxy-tls\") pod 
\"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.802072 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-cabundle\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.802096 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-proxy-tls\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.802148 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-signing-key\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.802468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-profile-collector-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.803005 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.807071 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/680978cb-e609-4292-827f-cc8a5b9c1438-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.807744 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" event={"ID":"24bf5f7b-1059-487a-95e7-ab72af29801e","Type":"ContainerStarted","Data":"5712611ce3f2b1aa74bbf006f99e5c6f0c92075d2b5f25c92184d1dc6922a9f0"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.807919 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-stats-auth\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.807930 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/750d6f55-7cf7-4376-8ead-6d481db69c2d-srv-cert\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.809384 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-certs\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.809468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdnwz\" (UniqueName: \"kubernetes.io/projected/2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be-kube-api-access-qdnwz\") pod \"openshift-config-operator-7777fb866f-6n4qc\" (UID: \"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.809691 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/027b1711-77a0-4359-bd98-246217fdb5f8-metrics-certs\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.812038 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db49f265-44d3-468b-8e2f-2246b02b57be-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.819866 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" event={"ID":"d2d35dfe-af6d-4c32-9e06-4650a6b1d52d","Type":"ContainerStarted","Data":"e869ba3f6495b26f3b312a5c8e8d5d8453d670e146550e509dd400f901b40af5"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.823160 4816 generic.go:334] "Generic (PLEG): container 
finished" podID="17c97aa5-8179-41d7-adcb-c4da341f4cec" containerID="fe3c413ed111e88244d59768575fe66a8310b3c2565efff2138e070edf0ec984" exitCode=0 Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.823545 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" event={"ID":"17c97aa5-8179-41d7-adcb-c4da341f4cec","Type":"ContainerDied","Data":"fe3c413ed111e88244d59768575fe66a8310b3c2565efff2138e070edf0ec984"} Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.829326 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") pod \"console-f9d7485db-blgl4\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.851090 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.868663 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mdm\" (UniqueName: \"kubernetes.io/projected/da3678d7-b440-44bd-b73b-2b04f1225094-kube-api-access-p7mdm\") pod \"cluster-image-registry-operator-dc59b4c8b-qwqzv\" (UID: \"da3678d7-b440-44bd-b73b-2b04f1225094\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.887627 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.888635 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.388620962 +0000 UTC m=+110.979884929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.893717 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfq9m\" (UniqueName: \"kubernetes.io/projected/0503425c-595f-4ff5-a7eb-c73168d939d5-kube-api-access-lfq9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-tgbrn\" (UID: \"0503425c-595f-4ff5-a7eb-c73168d939d5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.898465 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.919484 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rng4c\" (UniqueName: \"kubernetes.io/projected/b7e0b0c2-39e9-4aa5-934b-01abfe80d224-kube-api-access-rng4c\") pod \"console-operator-58897d9998-fxsjj\" (UID: \"b7e0b0c2-39e9-4aa5-934b-01abfe80d224\") " pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.952948 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22d5j\" (UniqueName: \"kubernetes.io/projected/4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0-kube-api-access-22d5j\") pod \"olm-operator-6b444d44fb-9znd7\" (UID: \"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.968138 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.973211 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qfb\" (UniqueName: \"kubernetes.io/projected/74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea-kube-api-access-z7qfb\") pod \"machine-config-operator-74547568cd-zdrwx\" (UID: \"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.990144 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:23 crc kubenswrapper[4816]: E0311 12:00:23.993094 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.493081771 +0000 UTC m=+111.084345738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:23 crc kubenswrapper[4816]: I0311 12:00:23.996120 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxdsn\" (UniqueName: \"kubernetes.io/projected/4b4119d5-f1a1-4d09-83c6-da7decba9ab4-kube-api-access-nxdsn\") pod \"machine-config-controller-84d6567774-gm7t5\" (UID: \"4b4119d5-f1a1-4d09-83c6-da7decba9ab4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.009035 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6qxn\" (UniqueName: \"kubernetes.io/projected/fd35e4e1-eb63-44a5-a8e3-376a87c20de2-kube-api-access-g6qxn\") pod \"machine-config-server-mws5d\" (UID: \"fd35e4e1-eb63-44a5-a8e3-376a87c20de2\") " pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.018133 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.024814 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.028704 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnlqr\" (UniqueName: \"kubernetes.io/projected/db49f265-44d3-468b-8e2f-2246b02b57be-kube-api-access-nnlqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ksjm4\" (UID: \"db49f265-44d3-468b-8e2f-2246b02b57be\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.032317 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58756: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.048789 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.066558 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.071941 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.072816 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghclh\" (UniqueName: \"kubernetes.io/projected/1b74d12c-0a8c-48b1-9931-950ea6e20d4a-kube-api-access-ghclh\") pod \"ingress-canary-2ltv9\" (UID: \"1b74d12c-0a8c-48b1-9931-950ea6e20d4a\") " pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.085768 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr7ks\" (UniqueName: \"kubernetes.io/projected/2eaac3e7-6f80-47da-a6c7-e415a0b8edbd-kube-api-access-gr7ks\") pod \"migrator-59844c95c7-4kd2n\" (UID: \"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.097356 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.097998 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.597982513 +0000 UTC m=+111.189246480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.099696 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9nvp\" (UniqueName: \"kubernetes.io/projected/1a17171b-c738-4862-a2a0-cbb09219322a-kube-api-access-q9nvp\") pod \"dns-default-wgxgk\" (UID: \"1a17171b-c738-4862-a2a0-cbb09219322a\") " pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.101255 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rft5w"] Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.123773 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36999e5d-2e84-4f16-8c9f-4a2c40a34cd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k74wh\" (UID: \"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.123893 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" Mar 11 12:00:24 crc kubenswrapper[4816]: W0311 12:00:24.130598 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c986ee_b3e9_4bd1_ae9c_7a70b04e1527.slice/crio-c8662d9af8cd0a9090c14a1f5f335228a3b31c7b8b889472214083ecc6cbeaa4 WatchSource:0}: Error finding container c8662d9af8cd0a9090c14a1f5f335228a3b31c7b8b889472214083ecc6cbeaa4: Status 404 returned error can't find the container with id c8662d9af8cd0a9090c14a1f5f335228a3b31c7b8b889472214083ecc6cbeaa4 Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.132715 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.145214 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58764: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.145344 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.153935 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.154264 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjg5g\" (UniqueName: \"kubernetes.io/projected/027b1711-77a0-4359-bd98-246217fdb5f8-kube-api-access-zjg5g\") pod \"router-default-5444994796-6m5gg\" (UID: \"027b1711-77a0-4359-bd98-246217fdb5f8\") " pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.156868 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67dd48ce-6361-442d-9552-f06346e4d8d4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ksn6f\" (UID: \"67dd48ce-6361-442d-9552-f06346e4d8d4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.161568 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.166497 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.182565 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5p6\" (UniqueName: \"kubernetes.io/projected/7aff6a5d-2a66-4ab5-ad53-878f5fea4115-kube-api-access-6n5p6\") pod \"package-server-manager-789f6589d5-6t4jp\" (UID: \"7aff6a5d-2a66-4ab5-ad53-878f5fea4115\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.195577 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wgxgk" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.196184 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv"] Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.198421 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7wqx\" (UniqueName: \"kubernetes.io/projected/750d6f55-7cf7-4376-8ead-6d481db69c2d-kube-api-access-g7wqx\") pod \"catalog-operator-68c6474976-256s6\" (UID: \"750d6f55-7cf7-4376-8ead-6d481db69c2d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.199319 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.199729 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.699716494 +0000 UTC m=+111.290980461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.207766 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.208461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") pod \"collect-profiles-29553840-xpb52\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.222454 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.231726 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.235720 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slmk9\" (UniqueName: \"kubernetes.io/projected/00d6d506-7c84-4fef-9dc9-85f855533c06-kube-api-access-slmk9\") pod \"packageserver-d55dfcdfc-vll2h\" (UID: \"00d6d506-7c84-4fef-9dc9-85f855533c06\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.244511 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ltv9" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.245647 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58768: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.259204 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtlst\" (UniqueName: \"kubernetes.io/projected/57df17b9-73f2-468a-8359-5a07f19a5493-kube-api-access-rtlst\") pod \"service-ca-operator-777779d784-28g7h\" (UID: \"57df17b9-73f2-468a-8359-5a07f19a5493\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.275668 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9c2804-ee65-4a09-9985-d2345aa7f82a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-t6j7t\" (UID: \"4b9c2804-ee65-4a09-9985-d2345aa7f82a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.276523 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mws5d" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.278584 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58772: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.288857 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgbd4\" (UniqueName: \"kubernetes.io/projected/9a782b5b-9eac-4b5b-8ca8-751111b2459b-kube-api-access-zgbd4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x96fz\" (UID: \"9a782b5b-9eac-4b5b-8ca8-751111b2459b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.299963 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.300881 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.800862338 +0000 UTC m=+111.392126305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.319997 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zj2p\" (UniqueName: \"kubernetes.io/projected/ba5682ea-6a62-4983-b525-5dc9612ad46d-kube-api-access-5zj2p\") pod \"csi-hostpathplugin-bb6wh\" (UID: \"ba5682ea-6a62-4983-b525-5dc9612ad46d\") " pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.324363 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn"] Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.346379 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58788: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.349056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgkcr\" (UniqueName: \"kubernetes.io/projected/680978cb-e609-4292-827f-cc8a5b9c1438-kube-api-access-bgkcr\") pod \"multus-admission-controller-857f4d67dd-zln7t\" (UID: \"680978cb-e609-4292-827f-cc8a5b9c1438\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.369696 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") pod \"marketplace-operator-79b997595-8gcm4\" (UID: 
\"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.381118 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") pod \"cni-sysctl-allowlist-ds-6bx5p\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.402100 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.402433 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:24.902420053 +0000 UTC m=+111.493684020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.405567 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.408501 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89cbz\" (UniqueName: \"kubernetes.io/projected/0acb833f-163a-47e1-8fb7-b9bc97b81fe1-kube-api-access-89cbz\") pod \"service-ca-9c57cc56f-tqt25\" (UID: \"0acb833f-163a-47e1-8fb7-b9bc97b81fe1\") " pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.411320 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.419406 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.438740 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.458713 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58804: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.471717 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.485580 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.489485 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.495733 4816 ???:1] "http: TLS handshake error from 192.168.126.11:58816: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.502310 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.502665 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.503204 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.003182266 +0000 UTC m=+111.594446233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.514815 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.566593 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.581099 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.604563 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.604985 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.104963888 +0000 UTC m=+111.696227885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.667442 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fxsjj"] Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.690800 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54110: no serving certificate available for the kubelet" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.706800 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.707474 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.207449721 +0000 UTC m=+111.798713688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.808550 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.808906 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.308893764 +0000 UTC m=+111.900157731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: W0311 12:00:24.862239 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e0b0c2_39e9_4aa5_934b_01abfe80d224.slice/crio-b8442c962c26f1276e3451ea967fc75e250fb66aedf5b429bd661544990e0297 WatchSource:0}: Error finding container b8442c962c26f1276e3451ea967fc75e250fb66aedf5b429bd661544990e0297: Status 404 returned error can't find the container with id b8442c962c26f1276e3451ea967fc75e250fb66aedf5b429bd661544990e0297 Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.864335 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x5fc4" podStartSLOduration=64.86432045 podStartE2EDuration="1m4.86432045s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:24.863567108 +0000 UTC m=+111.454831075" watchObservedRunningTime="2026-03-11 12:00:24.86432045 +0000 UTC m=+111.455584417" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.865586 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" event={"ID":"da3678d7-b440-44bd-b73b-2b04f1225094","Type":"ContainerStarted","Data":"ab2bf8f3508720b262044c8ff124a051028143a2ec8ee0caa1baa21992774819"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 
12:00:24.877950 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mws5d" event={"ID":"fd35e4e1-eb63-44a5-a8e3-376a87c20de2","Type":"ContainerStarted","Data":"9d428a31c3c8922e3578835c8520cb0c46d765311ca84626af2fdff60030e042"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.909051 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.909203 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.409177883 +0000 UTC m=+112.000441850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.909358 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:24 crc kubenswrapper[4816]: E0311 12:00:24.909667 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.409653717 +0000 UTC m=+112.000917684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.930101 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" event={"ID":"f66d48af-027e-448b-9897-9f0c62fbd6c0","Type":"ContainerStarted","Data":"83ce7ed2e7cc7dbd752594375f677dfc68b6916731972c64d0ad92023c6e83ac"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.930172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" event={"ID":"f66d48af-027e-448b-9897-9f0c62fbd6c0","Type":"ContainerStarted","Data":"119048a41fbdab7d4fa34dc3cbc49c3cc316f3d35e003b49f3dc4288bc4b06a9"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.966954 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" podStartSLOduration=64.966941296 podStartE2EDuration="1m4.966941296s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:24.965565957 +0000 UTC m=+111.556829924" watchObservedRunningTime="2026-03-11 12:00:24.966941296 +0000 UTC m=+111.558205263" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.967978 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6m5gg" 
event={"ID":"027b1711-77a0-4359-bd98-246217fdb5f8","Type":"ContainerStarted","Data":"420ade994713469a8d7dc3a9592eadb5bbdb58134bf2d223edb653287faf03e7"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.977989 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" event={"ID":"546d4851-e1c7-418b-8ba6-5847e5f9efde","Type":"ContainerStarted","Data":"bb301579c908efd9a833ba2c76294edf97abc1c238aa669d3b8696cb61fa9a56"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.993163 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwvvh" podStartSLOduration=64.993148366 podStartE2EDuration="1m4.993148366s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:24.991108938 +0000 UTC m=+111.582372905" watchObservedRunningTime="2026-03-11 12:00:24.993148366 +0000 UTC m=+111.584412333" Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.997740 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" event={"ID":"3af1f0c3-1a92-49f9-beec-dff95561c5dd","Type":"ContainerStarted","Data":"c19a0c4d64f143322e7f9f2a7b2398cffbaaa91830a246a52660246604b33d86"} Mar 11 12:00:24 crc kubenswrapper[4816]: I0311 12:00:24.999054 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" event={"ID":"0503425c-595f-4ff5-a7eb-c73168d939d5","Type":"ContainerStarted","Data":"17d4588ae41a0c39d95e19e5f26dfe7571e9fb623bf49b120095b8c404798267"} Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.000495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" 
event={"ID":"17c97aa5-8179-41d7-adcb-c4da341f4cec","Type":"ContainerStarted","Data":"f917625c43b93079809067b0f348da470474b0641aee04be57abde0f618f2ab5"} Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.006371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" event={"ID":"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527","Type":"ContainerStarted","Data":"c8662d9af8cd0a9090c14a1f5f335228a3b31c7b8b889472214083ecc6cbeaa4"} Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.006465 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.006533 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.010571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.011742 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.511720327 +0000 UTC m=+112.102984294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.028608 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" podStartSLOduration=65.02858982 podStartE2EDuration="1m5.02858982s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.026730717 +0000 UTC m=+111.617994684" watchObservedRunningTime="2026-03-11 12:00:25.02858982 +0000 UTC m=+111.619853787" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.069552 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dh658" podStartSLOduration=65.069536032 podStartE2EDuration="1m5.069536032s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.066921357 +0000 UTC m=+111.658185334" watchObservedRunningTime="2026-03-11 12:00:25.069536032 +0000 UTC m=+111.660799999" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.112182 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: 
\"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.112617 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.612597854 +0000 UTC m=+112.203861821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.216478 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.217433 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.717404883 +0000 UTC m=+112.308668850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.218605 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.218979 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.718964457 +0000 UTC m=+112.310228424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.320598 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.321444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.321927 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.821908193 +0000 UTC m=+112.413172160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.340451 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.389279 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54114: no serving certificate available for the kubelet" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.400106 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gc7hf" podStartSLOduration=65.40008708 podStartE2EDuration="1m5.40008708s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.398509485 +0000 UTC m=+111.989773452" watchObservedRunningTime="2026-03-11 12:00:25.40008708 +0000 UTC m=+111.991351047" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.424934 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.425325 4816 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:25.925312882 +0000 UTC m=+112.516576849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.466044 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ct9ss" podStartSLOduration=65.466025587 podStartE2EDuration="1m5.466025587s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.432183238 +0000 UTC m=+112.023447205" watchObservedRunningTime="2026-03-11 12:00:25.466025587 +0000 UTC m=+112.057289554" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.468419 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.469960 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"] Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.525786 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.526238 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.026221299 +0000 UTC m=+112.617485266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.628027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.628434 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.128422024 +0000 UTC m=+112.719685991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.719576 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.719538711 podStartE2EDuration="26.719538711s" podCreationTimestamp="2026-03-11 11:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.696969405 +0000 UTC m=+112.288233372" watchObservedRunningTime="2026-03-11 12:00:25.719538711 +0000 UTC m=+112.310802678" Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.729995 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.730649 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.230631908 +0000 UTC m=+112.821895875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.831515 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.831908 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.331890876 +0000 UTC m=+112.923154903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.936939 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:25 crc kubenswrapper[4816]: E0311 12:00:25.937395 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.437380784 +0000 UTC m=+113.028644751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:25 crc kubenswrapper[4816]: I0311 12:00:25.993943    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" podStartSLOduration=65.993928372 podStartE2EDuration="1m5.993928372s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:25.992234304 +0000 UTC m=+112.583498281" watchObservedRunningTime="2026-03-11 12:00:25.993928372 +0000 UTC m=+112.585192339"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.011499    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" event={"ID":"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be","Type":"ContainerStarted","Data":"6f23f4498ef808f215d3f7b697aecea792a62bebef118c14d8004705f7301ea3"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.011556    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" event={"ID":"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be","Type":"ContainerStarted","Data":"76fdb003be2f42f321b1f84a7b3ff2f9c6c39157b9fe6fe69e9cbf0b8e78ef15"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.013949    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6m5gg" event={"ID":"027b1711-77a0-4359-bd98-246217fdb5f8","Type":"ContainerStarted","Data":"fa083920068ccd5688cbd379373974a2b066c0874558fb23ea37e5bac5a67363"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.015282    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" event={"ID":"b7e0b0c2-39e9-4aa5-934b-01abfe80d224","Type":"ContainerStarted","Data":"3bca950243eb3b2a32f38d585904143b6c89f6ff8470498644b40b590c645be8"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.015308    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" event={"ID":"b7e0b0c2-39e9-4aa5-934b-01abfe80d224","Type":"ContainerStarted","Data":"b8442c962c26f1276e3451ea967fc75e250fb66aedf5b429bd661544990e0297"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.016013    4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fxsjj"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.017145    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" event={"ID":"da3678d7-b440-44bd-b73b-2b04f1225094","Type":"ContainerStarted","Data":"c6bea91cb2e3b32b2163418ca593a45df0ed244d08d3e62ca6ba50f125fb7cb5"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.018470    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" event={"ID":"c0c986ee-b3e9-4bd1-ae9c-7a70b04e1527","Type":"ContainerStarted","Data":"ec44ea2166db26681bb3b2144354ad250006efdac888036b28dd71be6c8b4c11"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.023328    4816 patch_prober.go:28] interesting pod/console-operator-58897d9998-fxsjj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.023382    4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" podUID="b7e0b0c2-39e9-4aa5-934b-01abfe80d224" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.024433    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" event={"ID":"0503425c-595f-4ff5-a7eb-c73168d939d5","Type":"ContainerStarted","Data":"0542d658e2694def8608b55ddc1d8f7873bd2dbfcd0ac44002f00d970538e265"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.025685    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mzkr9"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.025878    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-blgl4" event={"ID":"efc988f7-8a1a-4d22-b6bb-b2617c721017","Type":"ContainerStarted","Data":"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.025898    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-blgl4" event={"ID":"efc988f7-8a1a-4d22-b6bb-b2617c721017","Type":"ContainerStarted","Data":"59a99708271969fdd60bd64b8768b6f0fa05af801e0f7d034beaae8d3d4be471"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.027002    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" event={"ID":"546d4851-e1c7-418b-8ba6-5847e5f9efde","Type":"ContainerStarted","Data":"94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.031740    4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.038065    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.038439    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.538428835 +0000 UTC m=+113.129692802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.059938    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mws5d" event={"ID":"fd35e4e1-eb63-44a5-a8e3-376a87c20de2","Type":"ContainerStarted","Data":"59d2f81c6348a3e946f0356ffa9450b30abc22c2e7958ca8c06ed7e142d914bb"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.062319    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.084825    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" event={"ID":"3af1f0c3-1a92-49f9-beec-dff95561c5dd","Type":"ContainerStarted","Data":"f21741ee3ba7efac50919ebcbe24192aebd758333c328e95af61a037b6dd42f4"}
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.087830    4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.087905    4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.089462    4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.139699    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.159216    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.659195431 +0000 UTC m=+113.250459398 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.177327    4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.199757    4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.191773    4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6m5gg"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.224656    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.251012    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.256497    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.756483235 +0000 UTC m=+113.347747202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.302715    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.322135    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t5t6b" podStartSLOduration=66.322118363 podStartE2EDuration="1m6.322118363s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.319983022 +0000 UTC m=+112.911246989" watchObservedRunningTime="2026-03-11 12:00:26.322118363 +0000 UTC m=+112.913382330"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.330348    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.358939    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.359424    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.85940845 +0000 UTC m=+113.450672417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: W0311 12:00:26.390582    4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c040a86_9614_48cb_9df7_14c83b046dce.slice/crio-dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d WatchSource:0}: Error finding container dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d: Status 404 returned error can't find the container with id dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.402079    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ltv9"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.418562    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" podStartSLOduration=66.418542972 podStartE2EDuration="1m6.418542972s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.417847602 +0000 UTC m=+113.009111569" watchObservedRunningTime="2026-03-11 12:00:26.418542972 +0000 UTC m=+113.009806929"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.460083    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.460476    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:26.960462811 +0000 UTC m=+113.551726778 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.461652    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-blgl4" podStartSLOduration=66.461636045 podStartE2EDuration="1m6.461636045s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.459842874 +0000 UTC m=+113.051106851" watchObservedRunningTime="2026-03-11 12:00:26.461636045 +0000 UTC m=+113.052900012"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.509562    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6m5gg" podStartSLOduration=66.509544926 podStartE2EDuration="1m6.509544926s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.507456256 +0000 UTC m=+113.098720223" watchObservedRunningTime="2026-03-11 12:00:26.509544926 +0000 UTC m=+113.100808893"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.510420    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.527334    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.548120    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-28g7h"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.561164    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.561514    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.061500083 +0000 UTC m=+113.652764050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.562913    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.653827    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.655200    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tgbrn" podStartSLOduration=66.655179493 podStartE2EDuration="1m6.655179493s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.584720757 +0000 UTC m=+113.175984724" watchObservedRunningTime="2026-03-11 12:00:26.655179493 +0000 UTC m=+113.246443460"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.665199    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.668287    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.168269918 +0000 UTC m=+113.759533885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.696525    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgxgk"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.723975    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.739533    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zln7t"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.752852    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.754789    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mws5d" podStartSLOduration=5.754770373 podStartE2EDuration="5.754770373s" podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.752730554 +0000 UTC m=+113.343994521" watchObservedRunningTime="2026-03-11 12:00:26.754770373 +0000 UTC m=+113.346034340"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.767435    4816 ???:1] "http: TLS handshake error from 192.168.126.11:54128: no serving certificate available for the kubelet"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.776115    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.776591    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.276574567 +0000 UTC m=+113.867838534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.827729    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rft5w" podStartSLOduration=66.82771432 podStartE2EDuration="1m6.82771432s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.825640131 +0000 UTC m=+113.416904098" watchObservedRunningTime="2026-03-11 12:00:26.82771432 +0000 UTC m=+113.418978287"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.842420    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.851624    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tqt25"]
Mar 11 12:00:26 crc kubenswrapper[4816]: W0311 12:00:26.853034    4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680978cb_e609_4292_827f_cc8a5b9c1438.slice/crio-f5975ccfec851804c1a3ba6815d3df83aae82adb150bc99b31c85cc254a88d97 WatchSource:0}: Error finding container f5975ccfec851804c1a3ba6815d3df83aae82adb150bc99b31c85cc254a88d97: Status 404 returned error can't find the container with id f5975ccfec851804c1a3ba6815d3df83aae82adb150bc99b31c85cc254a88d97
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.853092    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.854973    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.858658    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.870218    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bb6wh"]
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.880401    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.880911    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.380899672 +0000 UTC m=+113.972163639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.919529    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" podStartSLOduration=66.919495236 podStartE2EDuration="1m6.919495236s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:26.912950619 +0000 UTC m=+113.504214586" watchObservedRunningTime="2026-03-11 12:00:26.919495236 +0000 UTC m=+113.510759203"
Mar 11 12:00:26 crc kubenswrapper[4816]: I0311 12:00:26.981367    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:26 crc kubenswrapper[4816]: E0311 12:00:26.981866    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.4818467 +0000 UTC m=+114.073110667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:26 crc kubenswrapper[4816]: W0311 12:00:26.981957    4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0acb833f_163a_47e1_8fb7_b9bc97b81fe1.slice/crio-e6b2a4078d81ca62b8207d6362521735fd66adf6d36d1b61a71c0e7dcd8d79d5 WatchSource:0}: Error finding container e6b2a4078d81ca62b8207d6362521735fd66adf6d36d1b61a71c0e7dcd8d79d5: Status 404 returned error can't find the container with id e6b2a4078d81ca62b8207d6362521735fd66adf6d36d1b61a71c0e7dcd8d79d5
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.083539    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.083871    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.583859669 +0000 UTC m=+114.175123636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.090050    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qwqzv" podStartSLOduration=67.090032436 podStartE2EDuration="1m7.090032436s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.089560823 +0000 UTC m=+113.680824790" watchObservedRunningTime="2026-03-11 12:00:27.090032436 +0000 UTC m=+113.681296403"
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.103565    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" event={"ID":"4b4119d5-f1a1-4d09-83c6-da7decba9ab4","Type":"ContainerStarted","Data":"4716f35279b6c16f2fb81d600a0f46e378f55f57dc8771d2eeab60a0abb74dab"}
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.104616    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" event={"ID":"0acb833f-163a-47e1-8fb7-b9bc97b81fe1","Type":"ContainerStarted","Data":"e6b2a4078d81ca62b8207d6362521735fd66adf6d36d1b61a71c0e7dcd8d79d5"}
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.120916    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" event={"ID":"bb999b74-ac20-4e84-b2c7-b16906afbf06","Type":"ContainerStarted","Data":"54da1f4160bc4846c3fcb5e2cae8ad60f73f88af6d98fed0b3efebf2636bdcca"}
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.120959    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" event={"ID":"bb999b74-ac20-4e84-b2c7-b16906afbf06","Type":"ContainerStarted","Data":"db231c364b2be3f1307b4f97cf1b96dd6b5a6a88202b97d4333636332a49c671"}
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.121701    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" podStartSLOduration=67.121691952 podStartE2EDuration="1m7.121691952s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.121341482 +0000 UTC m=+113.712605449" watchObservedRunningTime="2026-03-11 12:00:27.121691952 +0000 UTC m=+113.712955919"
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.128081    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgxgk" event={"ID":"1a17171b-c738-4862-a2a0-cbb09219322a","Type":"ContainerStarted","Data":"496699594abf773dc472d3e220c37a77b73baafcf4f43f773056457b28faa7e5"}
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.142613    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" event={"ID":"57df17b9-73f2-468a-8359-5a07f19a5493","Type":"ContainerStarted","Data":"16e9a23dc26af901821b3fa1119ea56533e472b0b7cc64924fa1a7c9b408adab"}
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.147368    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rxljr" podStartSLOduration=67.147351036 podStartE2EDuration="1m7.147351036s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.145609876 +0000 UTC m=+113.736873833" watchObservedRunningTime="2026-03-11 12:00:27.147351036 +0000 UTC m=+113.738615003"
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.153286    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" event={"ID":"3c040a86-9614-48cb-9df7-14c83b046dce","Type":"ContainerStarted","Data":"f3bda5d4e49a815a926b2f32c60f3932a76a7181a017078bc20f79926bfbf6a6"}
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.153338    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" event={"ID":"3c040a86-9614-48cb-9df7-14c83b046dce","Type":"ContainerStarted","Data":"dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d"}
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.179827    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podStartSLOduration=6.179808745 podStartE2EDuration="6.179808745s" podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.165450644 +0000 UTC m=+113.756714611" watchObservedRunningTime="2026-03-11 12:00:27.179808745 +0000 UTC m=+113.771072772"
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.180388    4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 11 12:00:27 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 11 12:00:27 crc kubenswrapper[4816]: [+]process-running ok
Mar 11 12:00:27 crc kubenswrapper[4816]: healthz check failed
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.180430    4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.192355    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.197427    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.697402768 +0000 UTC m=+114.288666745 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.202585 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" event={"ID":"750d6f55-7cf7-4376-8ead-6d481db69c2d","Type":"ContainerStarted","Data":"400513d5fe5aa4eb59bd7bf80a19508c62a58ad61159b9a9c51bf8cb9d9f7bf6"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.217538 4816 generic.go:334] "Generic (PLEG): container finished" podID="2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be" containerID="6f23f4498ef808f215d3f7b697aecea792a62bebef118c14d8004705f7301ea3" exitCode=0 Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.217611 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" event={"ID":"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be","Type":"ContainerDied","Data":"6f23f4498ef808f215d3f7b697aecea792a62bebef118c14d8004705f7301ea3"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.219518 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" event={"ID":"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4","Type":"ContainerStarted","Data":"4a6887167d206555e40cdc1b2ac3119254cfcc32db5af672ba93068c79875718"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.222848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ltv9" 
event={"ID":"1b74d12c-0a8c-48b1-9931-950ea6e20d4a","Type":"ContainerStarted","Data":"f68fbe554cc10cad015092a6618e898210fbdc88cbd226329328f1aeb3aaa473"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.222881 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ltv9" event={"ID":"1b74d12c-0a8c-48b1-9931-950ea6e20d4a","Type":"ContainerStarted","Data":"bbc911364a81eb26f1c0447ebf813d8828856fa7dd7a0bd30bc6f870ce0705c8"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.224524 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerStarted","Data":"18da590f53c2a68db8ccc3639b30699431b029db82a4def3280157c1b87bba73"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.225582 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" podStartSLOduration=27.225555254 podStartE2EDuration="27.225555254s" podCreationTimestamp="2026-03-11 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.21283774 +0000 UTC m=+113.804101707" watchObservedRunningTime="2026-03-11 12:00:27.225555254 +0000 UTC m=+113.816819221" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.260175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" event={"ID":"680978cb-e609-4292-827f-cc8a5b9c1438","Type":"ContainerStarted","Data":"f5975ccfec851804c1a3ba6815d3df83aae82adb150bc99b31c85cc254a88d97"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.288142 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" 
event={"ID":"9a782b5b-9eac-4b5b-8ca8-751111b2459b","Type":"ContainerStarted","Data":"9ef7a7bd1419b2085d1576d9a7b67686bdd53f8154c42e3d42c23a1ce00008f3"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.297185 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.298381 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.798370467 +0000 UTC m=+114.389634434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.298623 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" event={"ID":"67dd48ce-6361-442d-9552-f06346e4d8d4","Type":"ContainerStarted","Data":"d53c0f93e8679b10c3ffcfb6e3d167bda109cd94e97318cb36ef5707be22a822"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.301377 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:27 crc 
kubenswrapper[4816]: I0311 12:00:27.301421 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" event={"ID":"7aff6a5d-2a66-4ab5-ad53-878f5fea4115","Type":"ContainerStarted","Data":"34b11f77470b1637a3485ad195e041ef69dc18d563a73ae4582797cb787baa83"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.301446 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.335841 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"5536eee87ca3b26c68c1a05aab8edad334cd285012e57c031d6889b3862a5654"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.357499 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.385599 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" event={"ID":"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd","Type":"ContainerStarted","Data":"0b5288ecacf11c9a7b38d572188af36741fc424b7fa7a544bd248cb7b4083cfa"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.385723 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" event={"ID":"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd","Type":"ContainerStarted","Data":"c4e5735cb75a4c02b5b1e71d4c701f4a1cb48064e14db1dbd61397d5c1842e47"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.395433 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2ltv9" podStartSLOduration=6.395416474 podStartE2EDuration="6.395416474s" 
podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.322432896 +0000 UTC m=+113.913696863" watchObservedRunningTime="2026-03-11 12:00:27.395416474 +0000 UTC m=+113.986680441" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.401103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.401164 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.901148468 +0000 UTC m=+114.492412435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.403368 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.403773 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:27.903755873 +0000 UTC m=+114.495019830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.421530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" event={"ID":"4b9c2804-ee65-4a09-9985-d2345aa7f82a","Type":"ContainerStarted","Data":"311ac96d4d356a7358644c9e2578d6c1fa20ef45622ac37069872540965f4bb7"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.430749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" event={"ID":"00d6d506-7c84-4fef-9dc9-85f855533c06","Type":"ContainerStarted","Data":"1dcb8225761fe22e5903f2fc7e08b1eeb5c154530d31e4174f16177fd54aca84"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.461544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" event={"ID":"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0","Type":"ContainerStarted","Data":"625e9f5d0b2fd0bfae955ffd2e1aa115be2eaf2540a1f99a6a612612eed4193f"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.461594 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" event={"ID":"4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0","Type":"ContainerStarted","Data":"2d4bfe343a06974e898e59211c3213d04af9d8bda390419771b22629d9ef366a"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.462694 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.467022 4816 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-9znd7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.467109 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" podUID="4ea4ac0c-25f8-4dab-ade6-372cd0ec83d0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.512682 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.513374 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.013359129 +0000 UTC m=+114.604623096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.514134 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" event={"ID":"db49f265-44d3-468b-8e2f-2246b02b57be","Type":"ContainerStarted","Data":"9de55b662b2bf4787bdafff13f7749488700cec65a809df12b9b8647839057ea"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.514168 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" event={"ID":"db49f265-44d3-468b-8e2f-2246b02b57be","Type":"ContainerStarted","Data":"214e8fc54c8b54257321732922c5740805d1f1b722e1e4edbb6c4d1803062fd4"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.529806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" event={"ID":"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea","Type":"ContainerStarted","Data":"20fd568f45bd6095124b7be4e165bad17936ef91eff1dd848b5ee022ef1f11e5"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.529851 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" event={"ID":"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea","Type":"ContainerStarted","Data":"da70fe91baa5d29aa10ff0569fbce93a2f818c73f5f48d81b341eb82840a9409"} Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.532491 4816 patch_prober.go:28] interesting pod/console-operator-58897d9998-fxsjj 
container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.532549 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fxsjj" podUID="b7e0b0c2-39e9-4aa5-934b-01abfe80d224" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.545748 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ksjm4" podStartSLOduration=67.545728205 podStartE2EDuration="1m7.545728205s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.534469813 +0000 UTC m=+114.125733780" watchObservedRunningTime="2026-03-11 12:00:27.545728205 +0000 UTC m=+114.136992172" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.546278 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7" podStartSLOduration=67.546272451 podStartE2EDuration="1m7.546272451s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:27.511945809 +0000 UTC m=+114.103209776" watchObservedRunningTime="2026-03-11 12:00:27.546272451 +0000 UTC m=+114.137536418" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.560546 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r2nzn" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.610185 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.610447 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.616361 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.624435 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.124405866 +0000 UTC m=+114.715669913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.722516 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.723176 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.223157942 +0000 UTC m=+114.814421919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.827632 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.827993 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.327973331 +0000 UTC m=+114.919237298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.886819 4816 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pjsgk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]log ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]etcd ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/generic-apiserver-start-informers ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/max-in-flight-filter ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 11 12:00:27 crc kubenswrapper[4816]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 11 12:00:27 crc kubenswrapper[4816]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/project.openshift.io-projectcache ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/openshift.io-startinformers ok Mar 11 12:00:27 crc kubenswrapper[4816]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 11 12:00:27 crc 
kubenswrapper[4816]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 11 12:00:27 crc kubenswrapper[4816]: livez check failed Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.887111 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" podUID="3af1f0c3-1a92-49f9-beec-dff95561c5dd" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.922103 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6bx5p"] Mar 11 12:00:27 crc kubenswrapper[4816]: I0311 12:00:27.929040 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:27 crc kubenswrapper[4816]: E0311 12:00:27.929523 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.429490776 +0000 UTC m=+115.020754743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.030214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.030523 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.530510466 +0000 UTC m=+115.121774433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.131224 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.131436 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.631419353 +0000 UTC m=+115.222683320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.131640 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.131965 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.631953748 +0000 UTC m=+115.223217715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.168591 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 11 12:00:28 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 11 12:00:28 crc kubenswrapper[4816]: [+]process-running ok
Mar 11 12:00:28 crc kubenswrapper[4816]: healthz check failed
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.168651 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.232434 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.232728 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.732702831 +0000 UTC m=+115.323966798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.232884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.233196 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.733182985 +0000 UTC m=+115.324446952 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.333821 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.333976 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.833946778 +0000 UTC m=+115.425210755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.334034 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.334119 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.334399 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.834388171 +0000 UTC m=+115.425652138 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.342871 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b59d67-b771-4a57-b2a8-84303ec4d9bd-metrics-certs\") pod \"network-metrics-daemon-tt4rv\" (UID: \"91b59d67-b771-4a57-b2a8-84303ec4d9bd\") " pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.435562 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.435763 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.935738601 +0000 UTC m=+115.527002568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.435888 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.436242 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:28.936229375 +0000 UTC m=+115.527493342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.503902 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tt4rv"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.536837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.537480 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.037461282 +0000 UTC m=+115.628725239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.562267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" event={"ID":"00d6d506-7c84-4fef-9dc9-85f855533c06","Type":"ContainerStarted","Data":"e818fd582cfa1c624d153ed346d8d2561dc858f09a93c614502316139405f7df"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.563524 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.571678 4816 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vll2h container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body=
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.571733 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" podUID="00d6d506-7c84-4fef-9dc9-85f855533c06" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.598311 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" podStartSLOduration=68.598298132 podStartE2EDuration="1m8.598298132s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.596643525 +0000 UTC m=+115.187907492" watchObservedRunningTime="2026-03-11 12:00:28.598298132 +0000 UTC m=+115.189562099"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.604174 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" event={"ID":"680978cb-e609-4292-827f-cc8a5b9c1438","Type":"ContainerStarted","Data":"d8e0162229bd3db73cb4b0eeb62e87748a2ab24c3c120f1ce05953a811943f3a"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.604215 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" event={"ID":"680978cb-e609-4292-827f-cc8a5b9c1438","Type":"ContainerStarted","Data":"b0aa7d78792eb323f1a29568eaf840ba018bb032b7729b69eab01f3e53bcb74f"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.625185 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zln7t" podStartSLOduration=68.625170451 podStartE2EDuration="1m8.625170451s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.623481613 +0000 UTC m=+115.214745580" watchObservedRunningTime="2026-03-11 12:00:28.625170451 +0000 UTC m=+115.216434418"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.625707 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" event={"ID":"4b4119d5-f1a1-4d09-83c6-da7decba9ab4","Type":"ContainerStarted","Data":"addfd347014ed52df7794035d7b9a4debe54f61c6d6f028766184333e5e1dc2c"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.625749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" event={"ID":"4b4119d5-f1a1-4d09-83c6-da7decba9ab4","Type":"ContainerStarted","Data":"f32ffe01e1a0fcc6604baceed20d820272e5a68932e345d2492cce4a42a677b4"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.640074 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.642332 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.142319322 +0000 UTC m=+115.733583289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.646982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" event={"ID":"2eaac3e7-6f80-47da-a6c7-e415a0b8edbd","Type":"ContainerStarted","Data":"4f8b0832076daae74be53cca6035926e435c7c3a7f318de1ef6e3199667b54b2"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.658824 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gm7t5" podStartSLOduration=68.658809234 podStartE2EDuration="1m8.658809234s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.657000212 +0000 UTC m=+115.248264179" watchObservedRunningTime="2026-03-11 12:00:28.658809234 +0000 UTC m=+115.250073201"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.671106 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" event={"ID":"36999e5d-2e84-4f16-8c9f-4a2c40a34cd4","Type":"ContainerStarted","Data":"7ce3d816fcccc907bf29f351448d8897be42efa04eaa3896a9e32c00c6103b6f"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.693943 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" event={"ID":"0acb833f-163a-47e1-8fb7-b9bc97b81fe1","Type":"ContainerStarted","Data":"5b923cb3ffe100c411947fc416ed4c455c8878af0d16f2650fa92108bbb1f053"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.696079 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerStarted","Data":"8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.696721 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.697783 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" event={"ID":"67dd48ce-6361-442d-9552-f06346e4d8d4","Type":"ContainerStarted","Data":"fd6b49190e602fec1028c6aa848dd26559f95b874b7a6531d2fd8b5cd2571187"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.699316 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" event={"ID":"7aff6a5d-2a66-4ab5-ad53-878f5fea4115","Type":"ContainerStarted","Data":"0bede3b68dc0f0441392fc57baf54d9c3af3b0d7760a2f7b93b4cb6aeab6c1ef"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.699348 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" event={"ID":"7aff6a5d-2a66-4ab5-ad53-878f5fea4115","Type":"ContainerStarted","Data":"a8d8db370a9b41f32c69ce59576c35b65b768e3a327b6f66e777f1f0e6393cc9"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.699795 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.701365 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" event={"ID":"74bff80f-f1ae-408d-b6e2-0bdca1c5c0ea","Type":"ContainerStarted","Data":"7db71036cde4280388b996ebdddbfc73d6c04f34960ba91294120ca7c453323a"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.714757 4816 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8gcm4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.714813 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.722142 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k74wh" podStartSLOduration=68.722126306 podStartE2EDuration="1m8.722126306s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.720654934 +0000 UTC m=+115.311918901" watchObservedRunningTime="2026-03-11 12:00:28.722126306 +0000 UTC m=+115.313390273"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.723281 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4kd2n" podStartSLOduration=68.723275149 podStartE2EDuration="1m8.723275149s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.694710491 +0000 UTC m=+115.285974458" watchObservedRunningTime="2026-03-11 12:00:28.723275149 +0000 UTC m=+115.314539116"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.734314 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" event={"ID":"bb999b74-ac20-4e84-b2c7-b16906afbf06","Type":"ContainerStarted","Data":"854754aef75c5dd4fda6c4b5f198214b9610019973f5ebb2ae801d50d4cf7929"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.741393 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.741632 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.241604173 +0000 UTC m=+115.832868130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.741725 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.743201 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.243185808 +0000 UTC m=+115.834449865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.747303 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" event={"ID":"57df17b9-73f2-468a-8359-5a07f19a5493","Type":"ContainerStarted","Data":"0475dc3f14cdd309eba8548944ac45b9a037b6703c4f0cfe3f54656702d6fdde"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.754576 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgxgk" event={"ID":"1a17171b-c738-4862-a2a0-cbb09219322a","Type":"ContainerStarted","Data":"cb948f85958844a1fa9a2001754abbfab8ccf9b60a31973e68c3b5136fc4cf7f"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.754636 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgxgk" event={"ID":"1a17171b-c738-4862-a2a0-cbb09219322a","Type":"ContainerStarted","Data":"9690d0e3275c28410c9a526af6e05325e0184ce4b4f91da89997623600412eaf"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.755292 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wgxgk"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.756851 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-tqt25" podStartSLOduration=68.756834109 podStartE2EDuration="1m8.756834109s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.754183733 +0000 UTC m=+115.345447700" watchObservedRunningTime="2026-03-11 12:00:28.756834109 +0000 UTC m=+115.348098076"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.762779 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" event={"ID":"750d6f55-7cf7-4376-8ead-6d481db69c2d","Type":"ContainerStarted","Data":"8a2503a1dd8cbc73f797464d5558e356105fc109fc34860490050a8dca4e4e5e"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.763829 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.774723 4816 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-256s6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body=
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.774782 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" podUID="750d6f55-7cf7-4376-8ead-6d481db69c2d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.775788 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" event={"ID":"9a782b5b-9eac-4b5b-8ca8-751111b2459b","Type":"ContainerStarted","Data":"e23b0d7be7a104e344a22c0c1f176758057c7f2b7edb7f10d52760db7c90e4e7"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.797355 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" event={"ID":"4b9c2804-ee65-4a09-9985-d2345aa7f82a","Type":"ContainerStarted","Data":"5fbf9043b40ffe58902cbb5afea9100285e835fb24b1122454ed3494446e1a5d"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.800838 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ksn6f" podStartSLOduration=68.800799907 podStartE2EDuration="1m8.800799907s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.778855099 +0000 UTC m=+115.370119066" watchObservedRunningTime="2026-03-11 12:00:28.800799907 +0000 UTC m=+115.392063874"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.801278 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mzkr9" podStartSLOduration=68.80127205 podStartE2EDuration="1m8.80127205s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.800456797 +0000 UTC m=+115.391720754" watchObservedRunningTime="2026-03-11 12:00:28.80127205 +0000 UTC m=+115.392536017"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.812550 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" event={"ID":"2517fd5f-6a0b-4ab4-991d-41ac3c9bd0be","Type":"ContainerStarted","Data":"c6d542e698cd1c920cc240a3c13d380658d0156d62a503c40e848230eda9b6e3"}
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.812725 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.824412 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdrwx" podStartSLOduration=68.824395882 podStartE2EDuration="1m8.824395882s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.823718313 +0000 UTC m=+115.414982280" watchObservedRunningTime="2026-03-11 12:00:28.824395882 +0000 UTC m=+115.415659849"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.825585 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fxsjj"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.840144 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9znd7"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.844300 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.844457 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.344439665 +0000 UTC m=+115.935703632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.844602 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.847060 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.34705034 +0000 UTC m=+115.938314307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.869408 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-28g7h" podStartSLOduration=68.869376309 podStartE2EDuration="1m8.869376309s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.860544056 +0000 UTC m=+115.451808013" watchObservedRunningTime="2026-03-11 12:00:28.869376309 +0000 UTC m=+115.460640276"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.889800 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" podStartSLOduration=68.889783703 podStartE2EDuration="1m8.889783703s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.888314671 +0000 UTC m=+115.479578638" watchObservedRunningTime="2026-03-11 12:00:28.889783703 +0000 UTC m=+115.481047670"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.922016 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" podStartSLOduration=68.921999335 podStartE2EDuration="1m8.921999335s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:28.920829711 +0000 UTC m=+115.512093678" watchObservedRunningTime="2026-03-11 12:00:28.921999335 +0000 UTC m=+115.513263302"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.945775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.945996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.946059 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.946146 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.946347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 12:00:28 crc kubenswrapper[4816]: E0311 12:00:28.947124 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.447110683 +0000 UTC m=+116.038374650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.957352 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.959125 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.964916 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.976534 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.986327 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.986576 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" containerID="cri-o://35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" gracePeriod=30 Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.988824 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" podStartSLOduration=68.988803266 podStartE2EDuration="1m8.988803266s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-11 12:00:28.981322012 +0000 UTC m=+115.572585979" watchObservedRunningTime="2026-03-11 12:00:28.988803266 +0000 UTC m=+115.580067233" Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.993837 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:28 crc kubenswrapper[4816]: I0311 12:00:28.994015 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" containerID="cri-o://067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" gracePeriod=30 Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.047971 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.048424 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.548408112 +0000 UTC m=+116.139672079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.109663 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tt4rv"] Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.135495 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.141716 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.148876 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.149202 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.649184965 +0000 UTC m=+116.240448932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.158184 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.170418 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:29 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:29 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:29 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.170472 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:29 crc kubenswrapper[4816]: W0311 12:00:29.170687 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b59d67_b771_4a57_b2a8_84303ec4d9bd.slice/crio-1fa2bca17353aa45b616c5d370c66b2256c1be892bee9030feaf0182219847a5 WatchSource:0}: Error finding container 1fa2bca17353aa45b616c5d370c66b2256c1be892bee9030feaf0182219847a5: Status 404 returned error can't find the container with id 
1fa2bca17353aa45b616c5d370c66b2256c1be892bee9030feaf0182219847a5 Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.178720 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wgxgk" podStartSLOduration=8.1787037 podStartE2EDuration="8.1787037s" podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:29.145773898 +0000 UTC m=+115.737037865" watchObservedRunningTime="2026-03-11 12:00:29.1787037 +0000 UTC m=+115.769967657" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.178845 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-t6j7t" podStartSLOduration=69.178842794 podStartE2EDuration="1m9.178842794s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:29.173553223 +0000 UTC m=+115.764817180" watchObservedRunningTime="2026-03-11 12:00:29.178842794 +0000 UTC m=+115.770106761" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.225020 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x96fz" podStartSLOduration=69.225000665 podStartE2EDuration="1m9.225000665s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:29.221737661 +0000 UTC m=+115.813001629" watchObservedRunningTime="2026-03-11 12:00:29.225000665 +0000 UTC m=+115.816264632" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.259154 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.259467 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.759456081 +0000 UTC m=+116.350720048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.260631 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" podStartSLOduration=69.260616604 podStartE2EDuration="1m9.260616604s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:29.258122943 +0000 UTC m=+115.849386910" watchObservedRunningTime="2026-03-11 12:00:29.260616604 +0000 UTC m=+115.851880571" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.362512 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.362868 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:29.862854199 +0000 UTC m=+116.454118166 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.429501 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54132: no serving certificate available for the kubelet" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.463967 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.464376 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-11 12:00:29.964364514 +0000 UTC m=+116.555628481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.568547 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.569137 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.069123272 +0000 UTC m=+116.660387239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.671008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.671309 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.171297775 +0000 UTC m=+116.762561742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.709056 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:29 crc kubenswrapper[4816]: W0311 12:00:29.767169 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c7c215c47e2ebf54d6c6dd174e46aaf1963e96116392bf7a714f25e370ba78ce WatchSource:0}: Error finding container c7c215c47e2ebf54d6c6dd174e46aaf1963e96116392bf7a714f25e370ba78ce: Status 404 returned error can't find the container with id c7c215c47e2ebf54d6c6dd174e46aaf1963e96116392bf7a714f25e370ba78ce Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771456 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771561 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") pod \"ef1d29fc-f278-4f20-8362-3c406634d8ff\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771604 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") pod \"ef1d29fc-f278-4f20-8362-3c406634d8ff\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771632 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") pod 
\"ef1d29fc-f278-4f20-8362-3c406634d8ff\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.771685 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") pod \"ef1d29fc-f278-4f20-8362-3c406634d8ff\" (UID: \"ef1d29fc-f278-4f20-8362-3c406634d8ff\") " Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.774722 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.274695444 +0000 UTC m=+116.865959411 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.779053 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config" (OuterVolumeSpecName: "config") pod "ef1d29fc-f278-4f20-8362-3c406634d8ff" (UID: "ef1d29fc-f278-4f20-8362-3c406634d8ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.779728 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.781018 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef1d29fc-f278-4f20-8362-3c406634d8ff" (UID: "ef1d29fc-f278-4f20-8362-3c406634d8ff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.804795 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef1d29fc-f278-4f20-8362-3c406634d8ff" (UID: "ef1d29fc-f278-4f20-8362-3c406634d8ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.806046 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg" (OuterVolumeSpecName: "kube-api-access-fzblg") pod "ef1d29fc-f278-4f20-8362-3c406634d8ff" (UID: "ef1d29fc-f278-4f20-8362-3c406634d8ff"). InnerVolumeSpecName "kube-api-access-fzblg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.872851 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873162 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873206 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873266 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873326 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") pod \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\" (UID: \"564c2921-e9eb-4a24-a5b7-1a8471d1586b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873637 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873692 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873707 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzblg\" (UniqueName: \"kubernetes.io/projected/ef1d29fc-f278-4f20-8362-3c406634d8ff-kube-api-access-fzblg\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873720 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1d29fc-f278-4f20-8362-3c406634d8ff-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.873733 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef1d29fc-f278-4f20-8362-3c406634d8ff-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.874000 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.373986625 +0000 UTC m=+116.965250592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.874310 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config" (OuterVolumeSpecName: "config") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.874715 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tt4rv" event={"ID":"91b59d67-b771-4a57-b2a8-84303ec4d9bd","Type":"ContainerStarted","Data":"6040a1d894051c4055c719c0a52de1f81cd096adc443a2192b997078f04868fa"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.874755 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tt4rv" event={"ID":"91b59d67-b771-4a57-b2a8-84303ec4d9bd","Type":"ContainerStarted","Data":"1fa2bca17353aa45b616c5d370c66b2256c1be892bee9030feaf0182219847a5"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.875416 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.882650 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca" (OuterVolumeSpecName: "client-ca") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.893884 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.897441 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq" (OuterVolumeSpecName: "kube-api-access-xw5bq") pod "564c2921-e9eb-4a24-a5b7-1a8471d1586b" (UID: "564c2921-e9eb-4a24-a5b7-1a8471d1586b"). InnerVolumeSpecName "kube-api-access-xw5bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904486 4816 generic.go:334] "Generic (PLEG): container finished" podID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerID="35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" exitCode=0 Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904573 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" event={"ID":"564c2921-e9eb-4a24-a5b7-1a8471d1586b","Type":"ContainerDied","Data":"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" event={"ID":"564c2921-e9eb-4a24-a5b7-1a8471d1586b","Type":"ContainerDied","Data":"306382581adac0ac9b7eb96a682fee969c6c0324fd34514acd435886ca5bcb46"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904621 4816 scope.go:117] "RemoveContainer" containerID="35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.904732 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nv429" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.950526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c7c215c47e2ebf54d6c6dd174e46aaf1963e96116392bf7a714f25e370ba78ce"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.971406 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"4b1b82ea95db10b44cfdd3575432186e33d5528e7acf08d32f9607876280b08f"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975265 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975537 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975548 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975571 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/564c2921-e9eb-4a24-a5b7-1a8471d1586b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975580 4816 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564c2921-e9eb-4a24-a5b7-1a8471d1586b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.975589 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw5bq\" (UniqueName: \"kubernetes.io/projected/564c2921-e9eb-4a24-a5b7-1a8471d1586b-kube-api-access-xw5bq\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.975662 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.475631813 +0000 UTC m=+117.066895780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.980199 4816 scope.go:117] "RemoveContainer" containerID="35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.992770 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:29 crc kubenswrapper[4816]: E0311 12:00:29.993005 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b\": container with ID starting with 
35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b not found: ID does not exist" containerID="35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.993045 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b"} err="failed to get container status \"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b\": rpc error: code = NotFound desc = could not find container \"35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b\": container with ID starting with 35536e0f12f0b360de404d447220e629a214cf40c465fa086e81ea108295ac6b not found: ID does not exist" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.993204 4816 generic.go:334] "Generic (PLEG): container finished" podID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerID="067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" exitCode=0 Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995080 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995099 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" event={"ID":"ef1d29fc-f278-4f20-8362-3c406634d8ff","Type":"ContainerDied","Data":"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995124 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nv429"] Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995140 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr" event={"ID":"ef1d29fc-f278-4f20-8362-3c406634d8ff","Type":"ContainerDied","Data":"095fa56b3beb4f734f86a3746d97623146bfffe930c63a78d60c59c578ed0242"} Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.995154 4816 scope.go:117] "RemoveContainer" containerID="067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" Mar 11 12:00:29 crc kubenswrapper[4816]: I0311 12:00:29.998150 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" gracePeriod=30 Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.014066 4816 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8gcm4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.014115 4816 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.017393 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-256s6" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.070380 4816 scope.go:117] "RemoveContainer" containerID="067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.083544 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.085679 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.585664272 +0000 UTC m=+117.176928239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.108046 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.108616 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1\": container with ID starting with 067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1 not found: ID does not exist" containerID="067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.108660 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1"} err="failed to get container status \"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1\": rpc error: code = NotFound desc = could not find container \"067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1\": container with ID starting with 067208dd2d05a8f631081581262fd02e620d3152bca9ba1e74aa403cc3cbbfd1 not found: ID does not exist" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.110310 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cdscr"] Mar 11 12:00:30 crc kubenswrapper[4816]: W0311 12:00:30.131376 4816 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-2b95c9ae7fa4598a347a3a69f4fa35abe4a733441a2b058dc15b0ea079136ebd WatchSource:0}: Error finding container 2b95c9ae7fa4598a347a3a69f4fa35abe4a733441a2b058dc15b0ea079136ebd: Status 404 returned error can't find the container with id 2b95c9ae7fa4598a347a3a69f4fa35abe4a733441a2b058dc15b0ea079136ebd Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.181056 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:30 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:30 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:30 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.181099 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.181802 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" path="/var/lib/kubelet/pods/564c2921-e9eb-4a24-a5b7-1a8471d1586b/volumes" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.182356 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" path="/var/lib/kubelet/pods/ef1d29fc-f278-4f20-8362-3c406634d8ff/volumes" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.185375 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.185660 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.685634312 +0000 UTC m=+117.276898279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.185968 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.186488 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.686472606 +0000 UTC m=+117.277736573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.287609 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.287775 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.787748454 +0000 UTC m=+117.379012421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.288147 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.288563 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.788545307 +0000 UTC m=+117.379809274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.388728 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.388910 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.888884938 +0000 UTC m=+117.480148905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.389949 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.390293 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.890284668 +0000 UTC m=+117.481548635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.425752 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vll2h" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.495276 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.495501 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.995470398 +0000 UTC m=+117.586734375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.496037 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.496472 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:30.996463176 +0000 UTC m=+117.587727143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.597584 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.597685 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.097670132 +0000 UTC m=+117.688934099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.597847 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.598116 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.098107525 +0000 UTC m=+117.689371492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642302 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.642549 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642569 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.642591 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642600 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642717 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="564c2921-e9eb-4a24-a5b7-1a8471d1586b" containerName="controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.642741 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1d29fc-f278-4f20-8362-3c406634d8ff" containerName="route-controller-manager" Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.643583 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.646181    4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.694433    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"]
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.699013    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.699214    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.699275    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.699321    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.699418    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.199402993 +0000 UTC m=+117.790666960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.800940    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801016    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801071    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801114    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.801469    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.301454893 +0000 UTC m=+117.892718860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801543    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.801622    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.838838    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") pod \"certified-operators-9fv28\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.844604    4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jwq6f"]
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.845746    4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.846993    4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.854532    4816 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.860862    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"]
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.901684    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.901837    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.901920    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.901961    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:30 crc kubenswrapper[4816]: E0311 12:00:30.902064    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.402049322 +0000 UTC m=+117.993313279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:30 crc kubenswrapper[4816]: I0311 12:00:30.960546    4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.003078    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.003147    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.003195    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.003225    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:31 crc kubenswrapper[4816]: E0311 12:00:31.003523    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.503511705 +0000 UTC m=+118.094775672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.004014    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.004211    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.009907    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tt4rv" event={"ID":"91b59d67-b771-4a57-b2a8-84303ec4d9bd","Type":"ContainerStarted","Data":"108155b2e0d568f79222b0c35c65ac8628e7eff006cbd9a71937c52d317b6c79"}
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.011556    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f536214d2d7e45a70f2551a23896f1f27009e74b06c936b5f1274110830510f7"}
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.011584    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"93b0d41aa9711271966eea402d33238cc42a3c444f85d7a083746c26838ae715"}
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.019340    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"275927a2ce16db15a1f7379ebd602e23fb3f5b46bb7a7ad8b9739ad525d8b6c5"}
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.019486    4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.022146    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"c499589a765355dee4120b42d30d815f1b0331b591fde949ecf5e9b984eb905f"}
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.022172    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"7d45b1c0f4524e501322e6b17e727c0a896418327d0a26e8846d2bf9ac2ae2c7"}
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.026242    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"967c9268c14141b6c3f2c9f2dc4498d9ae6f96d221f70dbf1c7dc1457f590425"}
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.026322    4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2b95c9ae7fa4598a347a3a69f4fa35abe4a733441a2b058dc15b0ea079136ebd"}
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.027657    4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tt4rv" podStartSLOduration=71.027642685 podStartE2EDuration="1m11.027642685s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:31.025963567 +0000 UTC m=+117.617227534" watchObservedRunningTime="2026-03-11 12:00:31.027642685 +0000 UTC m=+117.618906662"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.036269    4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.044290    4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"]
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.044935    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") pod \"community-operators-jwq6f\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.045228    4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.056471    4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"]
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.057267    4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.057536    4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"]
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.058118    4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.064013    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"]
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.066719    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.066750    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.066896    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.066934    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.068424    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.069650    4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.069756    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.069878    4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.070065    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.070201    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.074563    4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.078528    4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.082867    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"]
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.092072    4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"]
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.102872    4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.106972    4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107200    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107238    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxwz\" (UniqueName: \"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107276    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107381    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107400    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107420    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107434    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107461    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107489    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107524    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107552    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.107586    4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: E0311 12:00:31.108355    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.608341245 +0000 UTC m=+118.199605212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.154952    4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.172632    4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 11 12:00:31 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 11 12:00:31 crc kubenswrapper[4816]: [+]process-running ok
Mar 11 12:00:31 crc kubenswrapper[4816]: healthz check failed
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.172707    4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.183095    4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209492    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209533    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209554    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxwz\" (UniqueName: \"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209572    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209598    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209614    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209631    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209647    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209664    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209682    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209709    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209740    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.209769    4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.210533    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.211232    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.211481    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: E0311 12:00:31.211755    4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 12:00:31.711742353 +0000 UTC m=+118.303006320 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p426k" (UID: "9a7e3709-d407-4679-add6-375a835421be") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.212113    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.230891    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.234538    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.235169    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.240609    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.248138    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") pod \"controller-manager-7f7578748c-p527z\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.252797    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.253946    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") pod \"route-controller-manager-6fb4858c9f-v88nx\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"
Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.263111    4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxwz\" (UniqueName:
\"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") pod \"certified-operators-s2dh2\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.269873 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.271658 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.276375 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.292549 4816 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-11T12:00:30.854557873Z","Handler":null,"Name":""} Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.299833 4816 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.300034 4816 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.311048 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.325852 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.326091 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.326133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.326148 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.351028 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.351003578 podStartE2EDuration="351.003578ms" podCreationTimestamp="2026-03-11 12:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:31.3332348 +0000 UTC m=+117.924498767" watchObservedRunningTime="2026-03-11 12:00:31.351003578 +0000 
UTC m=+117.942267545" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.363039 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.378582 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.385511 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.399297 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.427681 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428089 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428648 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.428740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.434125 4816 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.434168 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.451795 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") pod \"community-operators-ndrbx\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.458274 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p426k\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.494647 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.586355 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.648036 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.837848 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:00:31 crc kubenswrapper[4816]: W0311 12:00:31.845455 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d5c9149_6a85_4e50_9569_6cc828e55a11.slice/crio-93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f WatchSource:0}: Error finding container 93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f: Status 404 returned error can't find the container with id 93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.864802 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:00:31 crc kubenswrapper[4816]: W0311 12:00:31.877052 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod756dd25b_5375_48bc_8578_a9585ef49e6c.slice/crio-6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec WatchSource:0}: Error finding container 6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec: Status 404 returned error can't find the container with id 6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.910308 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:00:31 crc kubenswrapper[4816]: I0311 12:00:31.960657 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.079719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerStarted","Data":"6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.103654 4816 generic.go:334] "Generic (PLEG): container finished" podID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerID="3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd" exitCode=0 Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.103750 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerDied","Data":"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.103780 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" 
event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerStarted","Data":"499f7962c1697f289517091d9831d7c624088927518036ee83a281ffd5b62905"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.110713 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.113733 4816 generic.go:334] "Generic (PLEG): container finished" podID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerID="1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb" exitCode=0 Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.113815 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerDied","Data":"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.113842 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerStarted","Data":"e00a61b1b339e0c135f2f8629c96ed94976ec15fddfa98352c7a50768117327d"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.122282 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" event={"ID":"9a7e3709-d407-4679-add6-375a835421be","Type":"ContainerStarted","Data":"ec23157cec86a7144fad1cf7ce6f1de12230714b1e857a2199a9972f099db0a1"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.139442 4816 patch_prober.go:28] interesting pod/route-controller-manager-6fb4858c9f-v88nx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.139517 4816 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.157628 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.158726 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" event={"ID":"1d5c9149-6a85-4e50-9569-6cc828e55a11","Type":"ContainerStarted","Data":"93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.158783 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.158798 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" event={"ID":"c904faa8-338a-4f9c-80fc-bad9d60139a0","Type":"ContainerStarted","Data":"868efa371ca880139b71d36617be11ed32ba7747cf0e6a8180c51bf10cbc179c"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.161410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" event={"ID":"ba5682ea-6a62-4983-b525-5dc9612ad46d","Type":"ContainerStarted","Data":"96b041a36f6aff46055c454f884cc9dfdcf1e7340fff69a4ab5d8c17f750bc64"} Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.161474 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.183916 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:32 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:32 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:32 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.183969 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.184750 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.184731483 podStartE2EDuration="184.731483ms" podCreationTimestamp="2026-03-11 12:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:32.181847221 +0000 UTC m=+118.773111188" watchObservedRunningTime="2026-03-11 12:00:32.184731483 +0000 UTC m=+118.775995450" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.200972 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podStartSLOduration=3.200956718 podStartE2EDuration="3.200956718s" podCreationTimestamp="2026-03-11 12:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:32.200713161 +0000 UTC 
m=+118.791977118" watchObservedRunningTime="2026-03-11 12:00:32.200956718 +0000 UTC m=+118.792220685" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.219412 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.510915 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.510982 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.511410 4816 patch_prober.go:28] interesting pod/downloads-7954f5f757-dh658 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.511459 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dh658" podUID="8c843417-3e01-48f9-b0b6-845fbbbf7eab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.605296 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.609987 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-pjsgk" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.631290 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bb6wh" podStartSLOduration=11.63127035 podStartE2EDuration="11.63127035s" podCreationTimestamp="2026-03-11 12:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:32.260554723 +0000 UTC m=+118.851818690" watchObservedRunningTime="2026-03-11 12:00:32.63127035 +0000 UTC m=+119.222534317" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.653544 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.654868 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.657282 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.674062 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.751608 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.751966 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.752010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853102 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853599 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853846 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.853951 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.885267 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") pod \"redhat-marketplace-rlvrz\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:32 crc kubenswrapper[4816]: I0311 12:00:32.970829 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.035835 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6n4qc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.041714 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.042703 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.060267 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.164179 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.165962 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.166060 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.166858 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:33 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:33 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:33 crc kubenswrapper[4816]: healthz 
check failed Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.166893 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.204713 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" event={"ID":"9a7e3709-d407-4679-add6-375a835421be","Type":"ContainerStarted","Data":"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.204946 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.210854 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" event={"ID":"1d5c9149-6a85-4e50-9569-6cc828e55a11","Type":"ContainerStarted","Data":"25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.214315 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" event={"ID":"c904faa8-338a-4f9c-80fc-bad9d60139a0","Type":"ContainerStarted","Data":"b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.214512 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.224485 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:00:33 crc 
kubenswrapper[4816]: I0311 12:00:33.224883 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.230842 4816 generic.go:334] "Generic (PLEG): container finished" podID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerID="c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac" exitCode=0 Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.231518 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerDied","Data":"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.238051 4816 generic.go:334] "Generic (PLEG): container finished" podID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerID="db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80" exitCode=0 Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.238862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerDied","Data":"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.239591 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerStarted","Data":"496964d22446ecfb6c504cae509de586a0cc99c038e5375b83e1db6c09ad3706"} Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.240347 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" podStartSLOduration=73.240314048 podStartE2EDuration="1m13.240314048s" podCreationTimestamp="2026-03-11 11:59:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:33.239991688 +0000 UTC m=+119.831255665" watchObservedRunningTime="2026-03-11 12:00:33.240314048 +0000 UTC m=+119.831578015" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.268178 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.268234 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.268286 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.272468 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" podStartSLOduration=4.272447447 podStartE2EDuration="4.272447447s" podCreationTimestamp="2026-03-11 12:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:33.270834401 +0000 UTC m=+119.862098388" watchObservedRunningTime="2026-03-11 
12:00:33.272447447 +0000 UTC m=+119.863711414" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.293538 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.293970 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.328865 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.347149 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") pod \"redhat-marketplace-cg4jl\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.367488 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.599039 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.658419 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.702441 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.703609 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.707027 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.708542 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.709382 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.778303 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.778383 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.849811 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.851118 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.852855 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.853191 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879905 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879944 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") pod 
\"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879962 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.879989 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.880390 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.902545 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.981079 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") 
pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.981123 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.981153 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.981915 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.983138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") pod \"redhat-operators-jtm2c\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:33 crc kubenswrapper[4816]: I0311 12:00:33.997611 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") pod \"redhat-operators-jtm2c\" (UID: 
\"ce281163-d6c0-444b-ba55-b488dd77b853\") " pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.038194 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.046627 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.049798 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.049823 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.051678 4816 patch_prober.go:28] interesting pod/console-f9d7485db-blgl4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.051728 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-blgl4" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.167339 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6m5gg" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.188540 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 11 12:00:34 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld Mar 11 12:00:34 crc kubenswrapper[4816]: [+]process-running ok Mar 11 12:00:34 crc kubenswrapper[4816]: healthz check failed Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.188584 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.192880 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.201540 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.246793 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.248158 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.258001 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.285492 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.285570 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.285666 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.291523 4816 generic.go:334] "Generic (PLEG): container finished" podID="e94af1b5-09ef-433f-91e6-7b352836273d" containerID="d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc" exitCode=0 Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.292911 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" 
event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerDied","Data":"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc"} Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.292952 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerStarted","Data":"a8cafecc50e94d07fe579d21307c54f31a39be731f47fedd9b733a84b5d89387"} Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.338476 4816 generic.go:334] "Generic (PLEG): container finished" podID="34f226df-3352-4423-822c-67891ad3a398" containerID="8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384" exitCode=0 Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.339537 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerDied","Data":"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384"} Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.339560 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerStarted","Data":"d0c2bd9596db386896db7ff2b9f9f3f47d22ce171a97edaa6f0b88c2da2cae3b"} Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.389999 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.390054 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.390203 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.390732 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.398487 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.432034 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") pod \"redhat-operators-8r7jt\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") " pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.448795 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.450026 4816 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.456202 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.456320 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.465093 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.491872 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.492229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.569816 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.580674 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.593937 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.594015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.594150 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.604212 4816 ???:1] "http: TLS handshake error from 192.168.126.11:54146: no serving certificate available for the kubelet" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.615677 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 12:00:34 crc 
kubenswrapper[4816]: E0311 12:00:34.616404 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:34 crc kubenswrapper[4816]: E0311 12:00:34.617970 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:34 crc kubenswrapper[4816]: E0311 12:00:34.632073 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 11 12:00:34 crc kubenswrapper[4816]: E0311 12:00:34.632145 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.791353 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 11 12:00:34 crc kubenswrapper[4816]: I0311 12:00:34.818267 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"]
Mar 11 12:00:34 crc kubenswrapper[4816]: W0311 12:00:34.883909 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce281163_d6c0_444b_ba55_b488dd77b853.slice/crio-7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0 WatchSource:0}: Error finding container 7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0: Status 404 returned error can't find the container with id 7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.057234 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"]
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.079680 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 11 12:00:35 crc kubenswrapper[4816]: W0311 12:00:35.090815 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod729acc42_ae45_498b_8b45_a0307fa7951e.slice/crio-3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195 WatchSource:0}: Error finding container 3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195: Status 404 returned error can't find the container with id 3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.167668 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 11 12:00:35 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 11 12:00:35 crc kubenswrapper[4816]: [+]process-running ok
Mar 11 12:00:35 crc kubenswrapper[4816]: healthz check failed
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.167938 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.357329 4816 generic.go:334] "Generic (PLEG): container finished" podID="3c040a86-9614-48cb-9df7-14c83b046dce" containerID="f3bda5d4e49a815a926b2f32c60f3932a76a7181a017078bc20f79926bfbf6a6" exitCode=0
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.357454 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" event={"ID":"3c040a86-9614-48cb-9df7-14c83b046dce","Type":"ContainerDied","Data":"f3bda5d4e49a815a926b2f32c60f3932a76a7181a017078bc20f79926bfbf6a6"}
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.360345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"729acc42-ae45-498b-8b45-a0307fa7951e","Type":"ContainerStarted","Data":"3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195"}
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.367289 4816 generic.go:334] "Generic (PLEG): container finished" podID="ce281163-d6c0-444b-ba55-b488dd77b853" containerID="3ab1f4b901f51b92d05dc18c4be8f53411d27fe11dfae52c40ff6b519e7e0cea" exitCode=0
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.367358 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerDied","Data":"3ab1f4b901f51b92d05dc18c4be8f53411d27fe11dfae52c40ff6b519e7e0cea"}
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.367389 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerStarted","Data":"7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0"}
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.401445 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d26fa831-2257-478d-a4dd-9b33c6a59198","Type":"ContainerStarted","Data":"58707dc69b9de00c9ab7464906274475a3e993f4c7902adf0157e977c06dc9c3"}
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.401530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d26fa831-2257-478d-a4dd-9b33c6a59198","Type":"ContainerStarted","Data":"3cc4bff0538c91690233bfb9bb26cddd773478266affdc0cbf69df64ec4f1cf1"}
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.406509 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerStarted","Data":"955eae43fa10fa99fdb4e5d4b56b2dccf5fef672fa575257d946cf0938c80e99"}
Mar 11 12:00:35 crc kubenswrapper[4816]: I0311 12:00:35.621888 4816 ???:1] "http: TLS handshake error from 192.168.126.11:53740: no serving certificate available for the kubelet"
Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.165984 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 11 12:00:36 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 11 12:00:36 crc kubenswrapper[4816]: [+]process-running ok
Mar 11 12:00:36 crc kubenswrapper[4816]: healthz check failed
Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.166328 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.436913 4816 generic.go:334] "Generic (PLEG): container finished" podID="d26fa831-2257-478d-a4dd-9b33c6a59198" containerID="58707dc69b9de00c9ab7464906274475a3e993f4c7902adf0157e977c06dc9c3" exitCode=0
Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.437023 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d26fa831-2257-478d-a4dd-9b33c6a59198","Type":"ContainerDied","Data":"58707dc69b9de00c9ab7464906274475a3e993f4c7902adf0157e977c06dc9c3"}
Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.444837 4816 generic.go:334] "Generic (PLEG): container finished" podID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerID="380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757" exitCode=0
Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.444954 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerDied","Data":"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757"}
Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.456566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"729acc42-ae45-498b-8b45-a0307fa7951e","Type":"ContainerStarted","Data":"2e0d42da96a573093e348b66293d58e2cc0ebf0eda6f03cabc520124bd4d6901"}
Mar 11 12:00:36 crc kubenswrapper[4816]: I0311 12:00:36.483871 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.483850067 podStartE2EDuration="2.483850067s" podCreationTimestamp="2026-03-11 12:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:00:36.477130345 +0000 UTC m=+123.068394312" watchObservedRunningTime="2026-03-11 12:00:36.483850067 +0000 UTC m=+123.075114034"
Mar 11 12:00:37 crc kubenswrapper[4816]: I0311 12:00:37.165009 4816 patch_prober.go:28] interesting pod/router-default-5444994796-6m5gg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 11 12:00:37 crc kubenswrapper[4816]: [-]has-synced failed: reason withheld
Mar 11 12:00:37 crc kubenswrapper[4816]: [+]process-running ok
Mar 11 12:00:37 crc kubenswrapper[4816]: healthz check failed
Mar 11 12:00:37 crc kubenswrapper[4816]: I0311 12:00:37.165072 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6m5gg" podUID="027b1711-77a0-4359-bd98-246217fdb5f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 11 12:00:37 crc kubenswrapper[4816]: I0311 12:00:37.480400 4816 generic.go:334] "Generic (PLEG): container finished" podID="729acc42-ae45-498b-8b45-a0307fa7951e" containerID="2e0d42da96a573093e348b66293d58e2cc0ebf0eda6f03cabc520124bd4d6901" exitCode=0
Mar 11 12:00:37 crc kubenswrapper[4816]: I0311 12:00:37.480490 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"729acc42-ae45-498b-8b45-a0307fa7951e","Type":"ContainerDied","Data":"2e0d42da96a573093e348b66293d58e2cc0ebf0eda6f03cabc520124bd4d6901"}
Mar 11 12:00:38 crc kubenswrapper[4816]: I0311 12:00:38.178650 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6m5gg"
Mar 11 12:00:38 crc kubenswrapper[4816]: I0311 12:00:38.181304 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6m5gg"
Mar 11 12:00:39 crc kubenswrapper[4816]: I0311 12:00:39.199032 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wgxgk"
Mar 11 12:00:39 crc kubenswrapper[4816]: I0311 12:00:39.905158 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h"
Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.516893 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dh658"
Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.678760 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.751784 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") pod \"729acc42-ae45-498b-8b45-a0307fa7951e\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") "
Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.751881 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") pod \"729acc42-ae45-498b-8b45-a0307fa7951e\" (UID: \"729acc42-ae45-498b-8b45-a0307fa7951e\") "
Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.751900 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "729acc42-ae45-498b-8b45-a0307fa7951e" (UID: "729acc42-ae45-498b-8b45-a0307fa7951e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.752109 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/729acc42-ae45-498b-8b45-a0307fa7951e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.759853 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "729acc42-ae45-498b-8b45-a0307fa7951e" (UID: "729acc42-ae45-498b-8b45-a0307fa7951e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:00:42 crc kubenswrapper[4816]: I0311 12:00:42.854850 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/729acc42-ae45-498b-8b45-a0307fa7951e-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.530427 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"729acc42-ae45-498b-8b45-a0307fa7951e","Type":"ContainerDied","Data":"3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195"}
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.530465 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c3778da4fe076eac5dfb343e6905d7832d37f3e3f3b5af3c80125789356c195"
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.530472 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.966802 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.973544 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") pod \"3c040a86-9614-48cb-9df7-14c83b046dce\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") "
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.973606 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") pod \"3c040a86-9614-48cb-9df7-14c83b046dce\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") "
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.973666 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") pod \"3c040a86-9614-48cb-9df7-14c83b046dce\" (UID: \"3c040a86-9614-48cb-9df7-14c83b046dce\") "
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.974488 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume" (OuterVolumeSpecName: "config-volume") pod "3c040a86-9614-48cb-9df7-14c83b046dce" (UID: "3c040a86-9614-48cb-9df7-14c83b046dce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.976150 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.979825 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh" (OuterVolumeSpecName: "kube-api-access-9hbdh") pod "3c040a86-9614-48cb-9df7-14c83b046dce" (UID: "3c040a86-9614-48cb-9df7-14c83b046dce"). InnerVolumeSpecName "kube-api-access-9hbdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:00:43 crc kubenswrapper[4816]: I0311 12:00:43.980488 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3c040a86-9614-48cb-9df7-14c83b046dce" (UID: "3c040a86-9614-48cb-9df7-14c83b046dce"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.074821 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c040a86-9614-48cb-9df7-14c83b046dce-config-volume\") on node \"crc\" DevicePath \"\""
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.074853 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c040a86-9614-48cb-9df7-14c83b046dce-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.074863 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hbdh\" (UniqueName: \"kubernetes.io/projected/3c040a86-9614-48cb-9df7-14c83b046dce-kube-api-access-9hbdh\") on node \"crc\" DevicePath \"\""
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.081361 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-blgl4"
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.089082 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-blgl4"
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.176982 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") pod \"d26fa831-2257-478d-a4dd-9b33c6a59198\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") "
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.177300 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") pod \"d26fa831-2257-478d-a4dd-9b33c6a59198\" (UID: \"d26fa831-2257-478d-a4dd-9b33c6a59198\") "
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.180001 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d26fa831-2257-478d-a4dd-9b33c6a59198" (UID: "d26fa831-2257-478d-a4dd-9b33c6a59198"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.189706 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d26fa831-2257-478d-a4dd-9b33c6a59198" (UID: "d26fa831-2257-478d-a4dd-9b33c6a59198"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.284974 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26fa831-2257-478d-a4dd-9b33c6a59198-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.285002 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26fa831-2257-478d-a4dd-9b33c6a59198-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.545834 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.546425 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52" event={"ID":"3c040a86-9614-48cb-9df7-14c83b046dce","Type":"ContainerDied","Data":"dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d"}
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.546503 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe4724e3bb10a60d2bcdde00ce0cce01eb1e6f17e7c5b379625a2f60d27762d"
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.557143 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.557174 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d26fa831-2257-478d-a4dd-9b33c6a59198","Type":"ContainerDied","Data":"3cc4bff0538c91690233bfb9bb26cddd773478266affdc0cbf69df64ec4f1cf1"}
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.557222 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc4bff0538c91690233bfb9bb26cddd773478266affdc0cbf69df64ec4f1cf1"
Mar 11 12:00:44 crc kubenswrapper[4816]: E0311 12:00:44.583970 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 11 12:00:44 crc kubenswrapper[4816]: E0311 12:00:44.585636 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 11 12:00:44 crc kubenswrapper[4816]: E0311 12:00:44.587485 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 11 12:00:44 crc kubenswrapper[4816]: E0311 12:00:44.587548 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins"
Mar 11 12:00:44 crc kubenswrapper[4816]: I0311 12:00:44.873959 4816 ???:1] "http: TLS handshake error from 192.168.126.11:52352: no serving certificate available for the kubelet"
Mar 11 12:00:48 crc kubenswrapper[4816]: I0311 12:00:48.518816 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"]
Mar 11 12:00:48 crc kubenswrapper[4816]: I0311 12:00:48.519829 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager" containerID="cri-o://b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710" gracePeriod=30
Mar 11 12:00:48 crc kubenswrapper[4816]: I0311 12:00:48.533180 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"]
Mar 11 12:00:48 crc kubenswrapper[4816]: I0311 12:00:48.533499 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" containerID="cri-o://25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1" gracePeriod=30
Mar 11 12:00:49 crc kubenswrapper[4816]: I0311 12:00:49.600380 4816 generic.go:334] "Generic (PLEG): container finished" podID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerID="25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1" exitCode=0
Mar 11 12:00:49 crc kubenswrapper[4816]: I0311 12:00:49.600412 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" event={"ID":"1d5c9149-6a85-4e50-9569-6cc828e55a11","Type":"ContainerDied","Data":"25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1"}
Mar 11 12:00:49 crc kubenswrapper[4816]: I0311 12:00:49.602635 4816 generic.go:334] "Generic (PLEG): container finished" podID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerID="b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710" exitCode=0
Mar 11 12:00:49 crc kubenswrapper[4816]: I0311 12:00:49.602664 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" event={"ID":"c904faa8-338a-4f9c-80fc-bad9d60139a0","Type":"ContainerDied","Data":"b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710"}
Mar 11 12:00:51 crc kubenswrapper[4816]: I0311 12:00:51.387024 4816 patch_prober.go:28] interesting pod/route-controller-manager-6fb4858c9f-v88nx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body=
Mar 11 12:00:51 crc kubenswrapper[4816]: I0311 12:00:51.387383 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused"
Mar 11 12:00:51 crc kubenswrapper[4816]: I0311 12:00:51.592223 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-p426k"
Mar 11 12:00:52 crc kubenswrapper[4816]: I0311 12:00:52.400632 4816 patch_prober.go:28] interesting pod/controller-manager-7f7578748c-p527z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 11 12:00:52 crc kubenswrapper[4816]: I0311 12:00:52.400711 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 11 12:00:54 crc kubenswrapper[4816]: E0311 12:00:54.584687 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 11 12:00:54 crc kubenswrapper[4816]: E0311 12:00:54.586751 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 11 12:00:54 crc kubenswrapper[4816]: E0311 12:00:54.588333 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 11 12:00:54 crc kubenswrapper[4816]: E0311 12:00:54.588366 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.361376 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.369111 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.369372 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz9fg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9fv28_openshift-marketplace(8d6e662d-8633-4e55-baf3-50a2c4d179a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.371007 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9fv28" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.390584 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"]
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.391004 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391102 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.391171 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="729acc42-ae45-498b-8b45-a0307fa7951e" containerName="pruner"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391233 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="729acc42-ae45-498b-8b45-a0307fa7951e" containerName="pruner"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.391333 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c040a86-9614-48cb-9df7-14c83b046dce" containerName="collect-profiles"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391439 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c040a86-9614-48cb-9df7-14c83b046dce" containerName="collect-profiles"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.391510 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26fa831-2257-478d-a4dd-9b33c6a59198" containerName="pruner"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391569 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26fa831-2257-478d-a4dd-9b33c6a59198" containerName="pruner"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.391739 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" containerName="controller-manager"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.401817 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26fa831-2257-478d-a4dd-9b33c6a59198" containerName="pruner"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.401867 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="729acc42-ae45-498b-8b45-a0307fa7951e" containerName="pruner"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.401886 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c040a86-9614-48cb-9df7-14c83b046dce" containerName="collect-profiles"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.402608 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.407682 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"]
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.459124 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.459380 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vsxwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-s2dh2_openshift-marketplace(756dd25b-5375-48bc-8578-a9585ef49e6c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.460540 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s2dh2" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.485362 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.485526 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xchpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jwq6f_openshift-marketplace(fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 11 12:00:58 crc kubenswrapper[4816]: E0311 12:00:58.486611 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jwq6f" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e"
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.541862 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") pod \"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") "
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.541919 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") pod \"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") "
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.541966 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") pod \"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") "
Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.542012 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") pod 
\"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.542771 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.542805 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.542964 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config" (OuterVolumeSpecName: "config") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543017 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") pod \"c904faa8-338a-4f9c-80fc-bad9d60139a0\" (UID: \"c904faa8-338a-4f9c-80fc-bad9d60139a0\") " Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543239 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543365 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543394 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgx8p\" (UniqueName: \"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543430 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") pod 
\"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543477 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543532 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543604 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.543638 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c904faa8-338a-4f9c-80fc-bad9d60139a0-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.550535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql" (OuterVolumeSpecName: "kube-api-access-gpvql") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "kube-api-access-gpvql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.551361 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c904faa8-338a-4f9c-80fc-bad9d60139a0" (UID: "c904faa8-338a-4f9c-80fc-bad9d60139a0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.645979 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646120 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646219 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgx8p\" (UniqueName: 
\"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646287 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646347 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpvql\" (UniqueName: \"kubernetes.io/projected/c904faa8-338a-4f9c-80fc-bad9d60139a0-kube-api-access-gpvql\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.646393 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c904faa8-338a-4f9c-80fc-bad9d60139a0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.647337 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.647637 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " 
pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.648088 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.651154 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.668661 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgx8p\" (UniqueName: \"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") pod \"controller-manager-5c8cff94b6-x9hdr\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.703758 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.705216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7578748c-p527z" event={"ID":"c904faa8-338a-4f9c-80fc-bad9d60139a0","Type":"ContainerDied","Data":"868efa371ca880139b71d36617be11ed32ba7747cf0e6a8180c51bf10cbc179c"} Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.705339 4816 scope.go:117] "RemoveContainer" containerID="b15503803ae7b01b4347bf2f0cc032c1e2e36293189d891e7329a3636d682710" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.724152 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.793135 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:00:58 crc kubenswrapper[4816]: I0311 12:00:58.796768 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f7578748c-p527z"] Mar 11 12:01:00 crc kubenswrapper[4816]: I0311 12:01:00.136762 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c904faa8-338a-4f9c-80fc-bad9d60139a0" path="/var/lib/kubelet/pods/c904faa8-338a-4f9c-80fc-bad9d60139a0/volumes" Mar 11 12:01:00 crc kubenswrapper[4816]: I0311 12:01:00.716337 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6bx5p_546d4851-e1c7-418b-8ba6-5847e5f9efde/kube-multus-additional-cni-plugins/0.log" Mar 11 12:01:00 crc kubenswrapper[4816]: I0311 12:01:00.716908 4816 generic.go:334] "Generic (PLEG): container finished" podID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" exitCode=137 Mar 11 12:01:00 crc 
kubenswrapper[4816]: I0311 12:01:00.716959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" event={"ID":"546d4851-e1c7-418b-8ba6-5847e5f9efde","Type":"ContainerDied","Data":"94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7"} Mar 11 12:01:01 crc kubenswrapper[4816]: E0311 12:01:01.370920 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9fv28" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" Mar 11 12:01:01 crc kubenswrapper[4816]: E0311 12:01:01.371013 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jwq6f" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" Mar 11 12:01:01 crc kubenswrapper[4816]: E0311 12:01:01.373303 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s2dh2" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.423159 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.455102 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:01 crc kubenswrapper[4816]: E0311 12:01:01.457549 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.457572 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.457663 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.458078 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.460054 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588597 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") pod \"1d5c9149-6a85-4e50-9569-6cc828e55a11\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588708 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") pod \"1d5c9149-6a85-4e50-9569-6cc828e55a11\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588790 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") pod \"1d5c9149-6a85-4e50-9569-6cc828e55a11\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588813 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") pod \"1d5c9149-6a85-4e50-9569-6cc828e55a11\" (UID: \"1d5c9149-6a85-4e50-9569-6cc828e55a11\") " Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588953 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") pod 
\"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.588991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.589024 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.589042 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.590202 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config" (OuterVolumeSpecName: "config") pod "1d5c9149-6a85-4e50-9569-6cc828e55a11" (UID: "1d5c9149-6a85-4e50-9569-6cc828e55a11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.590966 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d5c9149-6a85-4e50-9569-6cc828e55a11" (UID: "1d5c9149-6a85-4e50-9569-6cc828e55a11"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.594311 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d5c9149-6a85-4e50-9569-6cc828e55a11" (UID: "1d5c9149-6a85-4e50-9569-6cc828e55a11"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.594628 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx" (OuterVolumeSpecName: "kube-api-access-2gmfx") pod "1d5c9149-6a85-4e50-9569-6cc828e55a11" (UID: "1d5c9149-6a85-4e50-9569-6cc828e55a11"). InnerVolumeSpecName "kube-api-access-2gmfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690565 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690636 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690687 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690710 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690774 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690789 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5c9149-6a85-4e50-9569-6cc828e55a11-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690801 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gmfx\" (UniqueName: \"kubernetes.io/projected/1d5c9149-6a85-4e50-9569-6cc828e55a11-kube-api-access-2gmfx\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.690814 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5c9149-6a85-4e50-9569-6cc828e55a11-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.691935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.692818 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.694314 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") pod 
\"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.706011 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") pod \"route-controller-manager-54bddb5f44-mbxnl\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.723751 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" event={"ID":"1d5c9149-6a85-4e50-9569-6cc828e55a11","Type":"ContainerDied","Data":"93986dc36dc0ff7dd563d707eb2a02368f044d4c0b1cecd71fb46a82e85f624f"} Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.723795 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.754012 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.754073 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx"] Mar 11 12:01:01 crc kubenswrapper[4816]: I0311 12:01:01.776666 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.138894 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" path="/var/lib/kubelet/pods/1d5c9149-6a85-4e50-9569-6cc828e55a11/volumes" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.386130 4816 patch_prober.go:28] interesting pod/route-controller-manager-6fb4858c9f-v88nx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.386203 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fb4858c9f-v88nx" podUID="1d5c9149-6a85-4e50-9569-6cc828e55a11" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.859580 4816 scope.go:117] "RemoveContainer" containerID="25f0be79390049105752d95c1c8523ffd3475271c1e00a0aa23883ae8aa13fa1" Mar 11 12:01:02 crc kubenswrapper[4816]: E0311 12:01:02.904704 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 11 12:01:02 crc kubenswrapper[4816]: E0311 12:01:02.905276 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lvwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rlvrz_openshift-marketplace(e94af1b5-09ef-433f-91e6-7b352836273d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.906119 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6bx5p_546d4851-e1c7-418b-8ba6-5847e5f9efde/kube-multus-additional-cni-plugins/0.log" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.906187 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:01:02 crc kubenswrapper[4816]: E0311 12:01:02.906466 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rlvrz" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909124 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") pod \"546d4851-e1c7-418b-8ba6-5847e5f9efde\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909177 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") pod \"546d4851-e1c7-418b-8ba6-5847e5f9efde\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909240 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") pod \"546d4851-e1c7-418b-8ba6-5847e5f9efde\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909297 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" 
(UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") pod \"546d4851-e1c7-418b-8ba6-5847e5f9efde\" (UID: \"546d4851-e1c7-418b-8ba6-5847e5f9efde\") " Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.909821 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "546d4851-e1c7-418b-8ba6-5847e5f9efde" (UID: "546d4851-e1c7-418b-8ba6-5847e5f9efde"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.910147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "546d4851-e1c7-418b-8ba6-5847e5f9efde" (UID: "546d4851-e1c7-418b-8ba6-5847e5f9efde"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.910816 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready" (OuterVolumeSpecName: "ready") pod "546d4851-e1c7-418b-8ba6-5847e5f9efde" (UID: "546d4851-e1c7-418b-8ba6-5847e5f9efde"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:02 crc kubenswrapper[4816]: I0311 12:01:02.914371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g" (OuterVolumeSpecName: "kube-api-access-sxw9g") pod "546d4851-e1c7-418b-8ba6-5847e5f9efde" (UID: "546d4851-e1c7-418b-8ba6-5847e5f9efde"). InnerVolumeSpecName "kube-api-access-sxw9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.010847 4816 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/546d4851-e1c7-418b-8ba6-5847e5f9efde-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.010878 4816 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/546d4851-e1c7-418b-8ba6-5847e5f9efde-ready\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.010888 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxw9g\" (UniqueName: \"kubernetes.io/projected/546d4851-e1c7-418b-8ba6-5847e5f9efde-kube-api-access-sxw9g\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.010898 4816 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/546d4851-e1c7-418b-8ba6-5847e5f9efde-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.735989 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6bx5p_546d4851-e1c7-418b-8ba6-5847e5f9efde/kube-multus-additional-cni-plugins/0.log" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.736108 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.736095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6bx5p" event={"ID":"546d4851-e1c7-418b-8ba6-5847e5f9efde","Type":"ContainerDied","Data":"bb301579c908efd9a833ba2c76294edf97abc1c238aa669d3b8696cb61fa9a56"} Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.775707 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6bx5p"] Mar 11 12:01:03 crc kubenswrapper[4816]: I0311 12:01:03.776353 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6bx5p"] Mar 11 12:01:04 crc kubenswrapper[4816]: I0311 12:01:04.136582 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" path="/var/lib/kubelet/pods/546d4851-e1c7-418b-8ba6-5847e5f9efde/volumes" Mar 11 12:01:04 crc kubenswrapper[4816]: I0311 12:01:04.213816 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6t4jp" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.025106 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 12:01:06 crc kubenswrapper[4816]: E0311 12:01:06.025694 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.025708 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" containerName="kube-multus-additional-cni-plugins" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.025814 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="546d4851-e1c7-418b-8ba6-5847e5f9efde" 
containerName="kube-multus-additional-cni-plugins" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.026298 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.028440 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.028684 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.044391 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.048914 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.048967 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.149698 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc 
kubenswrapper[4816]: I0311 12:01:06.149743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.150127 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.176937 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:06 crc kubenswrapper[4816]: I0311 12:01:06.352325 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:07 crc kubenswrapper[4816]: E0311 12:01:07.070835 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rlvrz" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.445290 4816 scope.go:117] "RemoveContainer" containerID="94fc872f9120ae3b6c5bc8d7ce09def109b21a972702c2d063763160e11c44c7" Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.666404 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.696356 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.764471 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" event={"ID":"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1","Type":"ContainerStarted","Data":"4d23328b31768c096f02f39298c1b22ed736efa862708623ddbcd093bc7ea791"} Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.765442 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 12:01:07 crc kubenswrapper[4816]: I0311 12:01:07.766083 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" event={"ID":"27a893ee-c824-4b3a-a1a7-270040291753","Type":"ContainerStarted","Data":"6d8a80e724695a853b72862fcfc4d7a6e02121bd48950e2a771b5ef4a04ee4b8"} Mar 11 12:01:07 crc kubenswrapper[4816]: W0311 12:01:07.777529 4816 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1a8048f1_34ce_48b3_a273_bc4905efd9a0.slice/crio-a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd WatchSource:0}: Error finding container a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd: Status 404 returned error can't find the container with id a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.612294 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.707675 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.787994 4816 generic.go:334] "Generic (PLEG): container finished" podID="34f226df-3352-4423-822c-67891ad3a398" containerID="796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382" exitCode=0 Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.788095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerDied","Data":"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.791358 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" event={"ID":"27a893ee-c824-4b3a-a1a7-270040291753","Type":"ContainerStarted","Data":"e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.792445 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.794607 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerStarted","Data":"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.799455 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a8048f1-34ce-48b3-a273-bc4905efd9a0","Type":"ContainerStarted","Data":"6f08b7a0dd010a2b212934117f9352279e1a3b6e57801752f9f48c9d6215f346"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.799497 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a8048f1-34ce-48b3-a273-bc4905efd9a0","Type":"ContainerStarted","Data":"a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.801771 4816 generic.go:334] "Generic (PLEG): container finished" podID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerID="ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23" exitCode=0 Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.801843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerDied","Data":"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.805394 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerStarted","Data":"a7c62a5bd8897be83a617ee46b4e99b960de1d3c06824d144ebcc6c092953124"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.806064 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.815983 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" event={"ID":"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1","Type":"ContainerStarted","Data":"fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9"} Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.816499 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.825388 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.861319 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" podStartSLOduration=20.861303283 podStartE2EDuration="20.861303283s" podCreationTimestamp="2026-03-11 12:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:08.85980324 +0000 UTC m=+155.451067207" watchObservedRunningTime="2026-03-11 12:01:08.861303283 +0000 UTC m=+155.452567250" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.924193 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.924173552 podStartE2EDuration="2.924173552s" podCreationTimestamp="2026-03-11 12:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:08.921732092 +0000 UTC m=+155.512996059" watchObservedRunningTime="2026-03-11 
12:01:08.924173552 +0000 UTC m=+155.515437519" Mar 11 12:01:08 crc kubenswrapper[4816]: I0311 12:01:08.944657 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" podStartSLOduration=20.944639338 podStartE2EDuration="20.944639338s" podCreationTimestamp="2026-03-11 12:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:08.944185245 +0000 UTC m=+155.535449232" watchObservedRunningTime="2026-03-11 12:01:08.944639338 +0000 UTC m=+155.535903305" Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.149090 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.824160 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a8048f1-34ce-48b3-a273-bc4905efd9a0" containerID="6f08b7a0dd010a2b212934117f9352279e1a3b6e57801752f9f48c9d6215f346" exitCode=0 Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.824278 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a8048f1-34ce-48b3-a273-bc4905efd9a0","Type":"ContainerDied","Data":"6f08b7a0dd010a2b212934117f9352279e1a3b6e57801752f9f48c9d6215f346"} Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.824433 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerName="route-controller-manager" containerID="cri-o://fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9" gracePeriod=30 Mar 11 12:01:09 crc kubenswrapper[4816]: I0311 12:01:09.824983 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" podUID="27a893ee-c824-4b3a-a1a7-270040291753" containerName="controller-manager" containerID="cri-o://e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec" gracePeriod=30 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.838081 4816 generic.go:334] "Generic (PLEG): container finished" podID="ce281163-d6c0-444b-ba55-b488dd77b853" containerID="a7c62a5bd8897be83a617ee46b4e99b960de1d3c06824d144ebcc6c092953124" exitCode=0 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.838143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerDied","Data":"a7c62a5bd8897be83a617ee46b4e99b960de1d3c06824d144ebcc6c092953124"} Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.840152 4816 generic.go:334] "Generic (PLEG): container finished" podID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerID="fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9" exitCode=0 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.840286 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" event={"ID":"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1","Type":"ContainerDied","Data":"fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9"} Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.842436 4816 generic.go:334] "Generic (PLEG): container finished" podID="27a893ee-c824-4b3a-a1a7-270040291753" containerID="e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec" exitCode=0 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.842485 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" 
event={"ID":"27a893ee-c824-4b3a-a1a7-270040291753","Type":"ContainerDied","Data":"e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec"} Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.844809 4816 generic.go:334] "Generic (PLEG): container finished" podID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerID="278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a" exitCode=0 Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.844970 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerDied","Data":"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a"} Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.965484 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.971974 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.992793 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:10 crc kubenswrapper[4816]: E0311 12:01:10.993000 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a893ee-c824-4b3a-a1a7-270040291753" containerName="controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993011 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a893ee-c824-4b3a-a1a7-270040291753" containerName="controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: E0311 12:01:10.993022 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerName="route-controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993028 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerName="route-controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993122 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a893ee-c824-4b3a-a1a7-270040291753" containerName="controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993132 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" containerName="route-controller-manager" Mar 11 12:01:10 crc kubenswrapper[4816]: I0311 12:01:10.993559 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.013069 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041303 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041371 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041411 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgx8p\" (UniqueName: \"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041469 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041509 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") pod 
\"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041540 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") pod \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041578 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") pod \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041629 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") pod \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\" (UID: \"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041686 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") pod \"27a893ee-c824-4b3a-a1a7-270040291753\" (UID: \"27a893ee-c824-4b3a-a1a7-270040291753\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.041880 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 
12:01:11.041946 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.042018 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.042071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.042313 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config" (OuterVolumeSpecName: "config") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.043320 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.043396 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca" (OuterVolumeSpecName: "client-ca") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.043564 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" (UID: "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.043666 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config" (OuterVolumeSpecName: "config") pod "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" (UID: "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.049760 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" (UID: "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.049800 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p" (OuterVolumeSpecName: "kube-api-access-fgx8p") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "kube-api-access-fgx8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.053873 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695" (OuterVolumeSpecName: "kube-api-access-s5695") pod "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" (UID: "157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1"). InnerVolumeSpecName "kube-api-access-s5695". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.054291 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "27a893ee-c824-4b3a-a1a7-270040291753" (UID: "27a893ee-c824-4b3a-a1a7-270040291753"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.080727 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.143552 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") pod \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.143915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") pod \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\" (UID: \"1a8048f1-34ce-48b3-a273-bc4905efd9a0\") " Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144144 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: 
\"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144228 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144284 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144297 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144308 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144315 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a893ee-c824-4b3a-a1a7-270040291753-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144326 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgx8p\" (UniqueName: \"kubernetes.io/projected/27a893ee-c824-4b3a-a1a7-270040291753-kube-api-access-fgx8p\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144336 4816 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27a893ee-c824-4b3a-a1a7-270040291753-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144344 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144353 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.144362 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5695\" (UniqueName: \"kubernetes.io/projected/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1-kube-api-access-s5695\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.145140 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.145764 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1a8048f1-34ce-48b3-a273-bc4905efd9a0" (UID: "1a8048f1-34ce-48b3-a273-bc4905efd9a0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.146680 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.149555 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1a8048f1-34ce-48b3-a273-bc4905efd9a0" (UID: "1a8048f1-34ce-48b3-a273-bc4905efd9a0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.149901 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.160282 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") pod \"route-controller-manager-5d755bd99c-vmbvk\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") " pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.245007 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.245041 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a8048f1-34ce-48b3-a273-bc4905efd9a0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.319421 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.528471 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"] Mar 11 12:01:11 crc kubenswrapper[4816]: W0311 12:01:11.547005 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode00e505e_4736_4aee_b340_ef223d36cf41.slice/crio-867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e WatchSource:0}: Error finding container 867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e: Status 404 returned error can't find the container with id 867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.854587 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" event={"ID":"157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1","Type":"ContainerDied","Data":"4d23328b31768c096f02f39298c1b22ed736efa862708623ddbcd093bc7ea791"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.854647 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.854656 4816 scope.go:117] "RemoveContainer" containerID="fd8060a81740d2d82d00ee7b672322ac337e0a0886f149a3bf4e8ecff6b410c9" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.857791 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.857807 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr" event={"ID":"27a893ee-c824-4b3a-a1a7-270040291753","Type":"ContainerDied","Data":"6d8a80e724695a853b72862fcfc4d7a6e02121bd48950e2a771b5ef4a04ee4b8"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.864679 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerStarted","Data":"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.867485 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" event={"ID":"e00e505e-4736-4aee-b340-ef223d36cf41","Type":"ContainerStarted","Data":"867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.869867 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a8048f1-34ce-48b3-a273-bc4905efd9a0","Type":"ContainerDied","Data":"a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd"} Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.869894 4816 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a59dfcea4654114bb3d368003f90d6affcd94be583c75e559ea9e65d751021dd" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.869913 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.883320 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cg4jl" podStartSLOduration=2.044149483 podStartE2EDuration="38.883288173s" podCreationTimestamp="2026-03-11 12:00:33 +0000 UTC" firstStartedPulling="2026-03-11 12:00:34.350552266 +0000 UTC m=+120.941816233" lastFinishedPulling="2026-03-11 12:01:11.189690946 +0000 UTC m=+157.780954923" observedRunningTime="2026-03-11 12:01:11.880633867 +0000 UTC m=+158.471897844" watchObservedRunningTime="2026-03-11 12:01:11.883288173 +0000 UTC m=+158.474552140" Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.901717 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.909098 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c8cff94b6-x9hdr"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.912150 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.914600 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bddb5f44-mbxnl"] Mar 11 12:01:11 crc kubenswrapper[4816]: I0311 12:01:11.923784 4816 scope.go:117] "RemoveContainer" containerID="e34c7fd51c419ebcbf2509f97778647782961f1976493ea35a4a759ea50660ec" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.137280 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1" path="/var/lib/kubelet/pods/157ddd4f-3bcf-4bbc-9f7c-693ed29c56f1/volumes" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.137978 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a893ee-c824-4b3a-a1a7-270040291753" path="/var/lib/kubelet/pods/27a893ee-c824-4b3a-a1a7-270040291753/volumes" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.822933 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 12:01:12 crc kubenswrapper[4816]: E0311 12:01:12.823445 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8048f1-34ce-48b3-a273-bc4905efd9a0" containerName="pruner" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.823461 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8048f1-34ce-48b3-a273-bc4905efd9a0" containerName="pruner" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.823565 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8048f1-34ce-48b3-a273-bc4905efd9a0" containerName="pruner" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.823956 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.825685 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.828541 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.835129 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.869064 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.869130 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.869163 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.875040 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" 
event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerStarted","Data":"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56"} Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.880361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" event={"ID":"e00e505e-4736-4aee-b340-ef223d36cf41","Type":"ContainerStarted","Data":"c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2"} Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.909496 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ndrbx" podStartSLOduration=3.234815295 podStartE2EDuration="41.909478917s" podCreationTimestamp="2026-03-11 12:00:31 +0000 UTC" firstStartedPulling="2026-03-11 12:00:33.249060988 +0000 UTC m=+119.840324955" lastFinishedPulling="2026-03-11 12:01:11.92372461 +0000 UTC m=+158.514988577" observedRunningTime="2026-03-11 12:01:12.905240246 +0000 UTC m=+159.496504213" watchObservedRunningTime="2026-03-11 12:01:12.909478917 +0000 UTC m=+159.500742884" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.919430 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" podStartSLOduration=4.919413621 podStartE2EDuration="4.919413621s" podCreationTimestamp="2026-03-11 12:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:12.917199058 +0000 UTC m=+159.508463035" watchObservedRunningTime="2026-03-11 12:01:12.919413621 +0000 UTC m=+159.510677588" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.970463 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.970557 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.970665 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.971458 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:12 crc kubenswrapper[4816]: I0311 12:01:12.971523 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.003223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") pod \"installer-9-crc\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.083800 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.084422 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089318 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089464 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089556 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089645 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089709 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.089910 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.094736 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.096786 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173020 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173076 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173101 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173359 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.173601 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") pod 
\"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.190444 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.275810 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276028 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276278 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276319 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " 
pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276346 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.276393 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.277524 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.277747 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.290133 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") pod \"controller-manager-568c67664d-76gf4\" (UID: 
\"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.296632 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") pod \"controller-manager-568c67664d-76gf4\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") " pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.368644 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.368706 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.424073 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.690504 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"] Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.753890 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.886824 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"106a80c4-7132-43b4-930f-bd886787437f","Type":"ContainerStarted","Data":"b87f445ca27d573faee92ddd515c624b2b710e714f620c36718ab43fc1a2134f"} Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.888982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" event={"ID":"0eabc434-3f96-4124-9afc-ecb2466f2104","Type":"ContainerStarted","Data":"47dc6f78b21758a900162faa2d749022ed0a0a88a3ab047d3b81d585e1f50879"} Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.891182 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerStarted","Data":"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a"} Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.891906 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.896763 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" Mar 11 12:01:13 crc kubenswrapper[4816]: I0311 12:01:13.910569 4816 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-8r7jt" podStartSLOduration=9.346380039 podStartE2EDuration="39.910555191s" podCreationTimestamp="2026-03-11 12:00:34 +0000 UTC" firstStartedPulling="2026-03-11 12:00:42.620613312 +0000 UTC m=+129.211877279" lastFinishedPulling="2026-03-11 12:01:13.184788464 +0000 UTC m=+159.776052431" observedRunningTime="2026-03-11 12:01:13.90738175 +0000 UTC m=+160.498645717" watchObservedRunningTime="2026-03-11 12:01:13.910555191 +0000 UTC m=+160.501819158" Mar 11 12:01:14 crc kubenswrapper[4816]: I0311 12:01:14.570679 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:14 crc kubenswrapper[4816]: I0311 12:01:14.570984 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:14 crc kubenswrapper[4816]: I0311 12:01:14.961120 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-cg4jl" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server" probeResult="failure" output=< Mar 11 12:01:14 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:14 crc kubenswrapper[4816]: > Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.636204 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8r7jt" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server" probeResult="failure" output=< Mar 11 12:01:15 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:15 crc kubenswrapper[4816]: > Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.902475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" 
event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerStarted","Data":"b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f"} Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.903505 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"106a80c4-7132-43b4-930f-bd886787437f","Type":"ContainerStarted","Data":"13cc1621a3a1352dc36083505ef9245a833ca0fab13f1b74079c751c4ed90659"} Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.904778 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" event={"ID":"0eabc434-3f96-4124-9afc-ecb2466f2104","Type":"ContainerStarted","Data":"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"} Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.919926 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtm2c" podStartSLOduration=11.119883892 podStartE2EDuration="42.919907186s" podCreationTimestamp="2026-03-11 12:00:33 +0000 UTC" firstStartedPulling="2026-03-11 12:00:42.612537541 +0000 UTC m=+129.203801508" lastFinishedPulling="2026-03-11 12:01:14.412560835 +0000 UTC m=+161.003824802" observedRunningTime="2026-03-11 12:01:15.916746125 +0000 UTC m=+162.508010092" watchObservedRunningTime="2026-03-11 12:01:15.919907186 +0000 UTC m=+162.511171163" Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.941751 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.950155 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" podStartSLOduration=7.950139691 podStartE2EDuration="7.950139691s" podCreationTimestamp="2026-03-11 12:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:15.948594026 +0000 UTC m=+162.539857993" watchObservedRunningTime="2026-03-11 12:01:15.950139691 +0000 UTC m=+162.541403658" Mar 11 12:01:15 crc kubenswrapper[4816]: I0311 12:01:15.982419 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.982403904 podStartE2EDuration="3.982403904s" podCreationTimestamp="2026-03-11 12:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:15.978102601 +0000 UTC m=+162.569366588" watchObservedRunningTime="2026-03-11 12:01:15.982403904 +0000 UTC m=+162.573667871" Mar 11 12:01:16 crc kubenswrapper[4816]: I0311 12:01:16.910157 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:16 crc kubenswrapper[4816]: I0311 12:01:16.915869 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" Mar 11 12:01:17 crc kubenswrapper[4816]: I0311 12:01:17.915659 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerStarted","Data":"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca"} Mar 11 12:01:17 crc kubenswrapper[4816]: I0311 12:01:17.917037 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerStarted","Data":"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247"} Mar 11 12:01:17 crc kubenswrapper[4816]: I0311 12:01:17.919233 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerStarted","Data":"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17"} Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.931357 4816 generic.go:334] "Generic (PLEG): container finished" podID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerID="af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247" exitCode=0 Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.931441 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerDied","Data":"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247"} Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.934785 4816 generic.go:334] "Generic (PLEG): container finished" podID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerID="4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17" exitCode=0 Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.934858 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerDied","Data":"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17"} Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.936782 4816 generic.go:334] "Generic (PLEG): container finished" podID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerID="3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca" exitCode=0 Mar 11 12:01:19 crc kubenswrapper[4816]: I0311 12:01:19.936808 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerDied","Data":"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca"} Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.944807 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerStarted","Data":"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc"} Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.949290 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerStarted","Data":"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783"} Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.951433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerStarted","Data":"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98"} Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.962401 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.962467 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:01:20 crc kubenswrapper[4816]: I0311 12:01:20.980806 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2dh2" podStartSLOduration=2.78523218 podStartE2EDuration="49.980783516s" podCreationTimestamp="2026-03-11 12:00:31 +0000 UTC" firstStartedPulling="2026-03-11 12:00:33.235817969 +0000 UTC m=+119.827081946" lastFinishedPulling="2026-03-11 12:01:20.431369315 +0000 UTC m=+167.022633282" observedRunningTime="2026-03-11 12:01:20.978492871 +0000 UTC m=+167.569756858" watchObservedRunningTime="2026-03-11 12:01:20.980783516 +0000 UTC m=+167.572047483" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.005754 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jwq6f" podStartSLOduration=2.726452054 podStartE2EDuration="51.0057348s" podCreationTimestamp="2026-03-11 12:00:30 +0000 UTC" firstStartedPulling="2026-03-11 12:00:32.110440418 +0000 UTC m=+118.701704385" lastFinishedPulling="2026-03-11 12:01:20.389723164 +0000 UTC m=+166.980987131" observedRunningTime="2026-03-11 12:01:21.003331831 +0000 UTC m=+167.594595808" watchObservedRunningTime="2026-03-11 12:01:21.0057348 +0000 UTC m=+167.596998767" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.023947 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9fv28" podStartSLOduration=2.80319518 podStartE2EDuration="51.023929961s" podCreationTimestamp="2026-03-11 12:00:30 +0000 UTC" firstStartedPulling="2026-03-11 12:00:32.115173573 +0000 UTC m=+118.706437540" lastFinishedPulling="2026-03-11 12:01:20.335908354 +0000 UTC m=+166.927172321" observedRunningTime="2026-03-11 12:01:21.02251217 +0000 UTC m=+167.613776137" watchObservedRunningTime="2026-03-11 12:01:21.023929961 +0000 UTC m=+167.615193928" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.183876 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.184038 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.363599 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.363649 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 
12:01:21.648523 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.648571 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.718864 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.959092 4816 generic.go:334] "Generic (PLEG): container finished" podID="e94af1b5-09ef-433f-91e6-7b352836273d" containerID="e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941" exitCode=0 Mar 11 12:01:21 crc kubenswrapper[4816]: I0311 12:01:21.959188 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerDied","Data":"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941"} Mar 11 12:01:22 crc kubenswrapper[4816]: I0311 12:01:22.011067 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:22 crc kubenswrapper[4816]: I0311 12:01:22.013948 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9fv28" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" probeResult="failure" output=< Mar 11 12:01:22 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:22 crc kubenswrapper[4816]: > Mar 11 12:01:22 crc kubenswrapper[4816]: I0311 12:01:22.269534 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jwq6f" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" probeResult="failure" output=< Mar 11 
12:01:22 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:22 crc kubenswrapper[4816]: > Mar 11 12:01:22 crc kubenswrapper[4816]: I0311 12:01:22.402484 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s2dh2" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" probeResult="failure" output=< Mar 11 12:01:22 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:01:22 crc kubenswrapper[4816]: > Mar 11 12:01:23 crc kubenswrapper[4816]: I0311 12:01:23.417825 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:23 crc kubenswrapper[4816]: I0311 12:01:23.460431 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:23 crc kubenswrapper[4816]: I0311 12:01:23.975221 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerStarted","Data":"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17"} Mar 11 12:01:23 crc kubenswrapper[4816]: I0311 12:01:23.995873 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rlvrz" podStartSLOduration=3.542966326 podStartE2EDuration="51.995852678s" podCreationTimestamp="2026-03-11 12:00:32 +0000 UTC" firstStartedPulling="2026-03-11 12:00:34.293901945 +0000 UTC m=+120.885165912" lastFinishedPulling="2026-03-11 12:01:22.746788297 +0000 UTC m=+169.338052264" observedRunningTime="2026-03-11 12:01:23.994582752 +0000 UTC m=+170.585846759" watchObservedRunningTime="2026-03-11 12:01:23.995852678 +0000 UTC m=+170.587116645" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.032051 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cg4jl"] Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.202616 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.203088 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.230868 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.231228 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ndrbx" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="registry-server" containerID="cri-o://53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" gracePeriod=2 Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.247763 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.623734 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.673336 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8r7jt" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.704308 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.737140 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") pod \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.737548 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") pod \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.737596 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") pod \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\" (UID: \"ffe46307-0d92-4864-9aa4-b0ca2fc641d0\") " Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.738651 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities" (OuterVolumeSpecName: "utilities") pod "ffe46307-0d92-4864-9aa4-b0ca2fc641d0" (UID: "ffe46307-0d92-4864-9aa4-b0ca2fc641d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.743173 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg" (OuterVolumeSpecName: "kube-api-access-tgdtg") pod "ffe46307-0d92-4864-9aa4-b0ca2fc641d0" (UID: "ffe46307-0d92-4864-9aa4-b0ca2fc641d0"). InnerVolumeSpecName "kube-api-access-tgdtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.792630 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffe46307-0d92-4864-9aa4-b0ca2fc641d0" (UID: "ffe46307-0d92-4864-9aa4-b0ca2fc641d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.839138 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.839185 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.839202 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgdtg\" (UniqueName: \"kubernetes.io/projected/ffe46307-0d92-4864-9aa4-b0ca2fc641d0-kube-api-access-tgdtg\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.980950 4816 generic.go:334] "Generic (PLEG): container finished" podID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerID="53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" exitCode=0 Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.980992 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerDied","Data":"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56"} Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.981035 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-ndrbx" event={"ID":"ffe46307-0d92-4864-9aa4-b0ca2fc641d0","Type":"ContainerDied","Data":"496964d22446ecfb6c504cae509de586a0cc99c038e5375b83e1db6c09ad3706"} Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.981056 4816 scope.go:117] "RemoveContainer" containerID="53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.981088 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ndrbx" Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.981500 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cg4jl" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server" containerID="cri-o://db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792" gracePeriod=2 Mar 11 12:01:24 crc kubenswrapper[4816]: I0311 12:01:24.998122 4816 scope.go:117] "RemoveContainer" containerID="ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.013336 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.015996 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ndrbx"] Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.029838 4816 scope.go:117] "RemoveContainer" containerID="db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.029930 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.088910 4816 scope.go:117] "RemoveContainer" 
containerID="53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" Mar 11 12:01:25 crc kubenswrapper[4816]: E0311 12:01:25.089273 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56\": container with ID starting with 53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56 not found: ID does not exist" containerID="53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089312 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56"} err="failed to get container status \"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56\": rpc error: code = NotFound desc = could not find container \"53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56\": container with ID starting with 53861abbf276133c162d226cfa525d384ef03f62fc102777bee0f663f20bdb56 not found: ID does not exist" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089336 4816 scope.go:117] "RemoveContainer" containerID="ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23" Mar 11 12:01:25 crc kubenswrapper[4816]: E0311 12:01:25.089682 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23\": container with ID starting with ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23 not found: ID does not exist" containerID="ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089709 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23"} err="failed to get container status \"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23\": rpc error: code = NotFound desc = could not find container \"ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23\": container with ID starting with ad1b4e0d64cb2dee4b17f813f254e7a567bae3986d0eb4252bf8e91b220edf23 not found: ID does not exist" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089723 4816 scope.go:117] "RemoveContainer" containerID="db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80" Mar 11 12:01:25 crc kubenswrapper[4816]: E0311 12:01:25.089933 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80\": container with ID starting with db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80 not found: ID does not exist" containerID="db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.089953 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80"} err="failed to get container status \"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80\": rpc error: code = NotFound desc = could not find container \"db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80\": container with ID starting with db5556901c7abc5e94f121924c25248aaeceba682acde63ea94811a5a4dd7b80 not found: ID does not exist" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.439094 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg4jl" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.548087 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") pod \"34f226df-3352-4423-822c-67891ad3a398\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.548141 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") pod \"34f226df-3352-4423-822c-67891ad3a398\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.548186 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") pod \"34f226df-3352-4423-822c-67891ad3a398\" (UID: \"34f226df-3352-4423-822c-67891ad3a398\") " Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.548984 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities" (OuterVolumeSpecName: "utilities") pod "34f226df-3352-4423-822c-67891ad3a398" (UID: "34f226df-3352-4423-822c-67891ad3a398"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.558565 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p" (OuterVolumeSpecName: "kube-api-access-zp48p") pod "34f226df-3352-4423-822c-67891ad3a398" (UID: "34f226df-3352-4423-822c-67891ad3a398"). InnerVolumeSpecName "kube-api-access-zp48p". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.576484 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34f226df-3352-4423-822c-67891ad3a398" (UID: "34f226df-3352-4423-822c-67891ad3a398"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.649849 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.649882 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f226df-3352-4423-822c-67891ad3a398-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.649894 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp48p\" (UniqueName: \"kubernetes.io/projected/34f226df-3352-4423-822c-67891ad3a398-kube-api-access-zp48p\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.856282 4816 ???:1] "http: TLS handshake error from 192.168.126.11:48634: no serving certificate available for the kubelet"
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992693 4816 generic.go:334] "Generic (PLEG): container finished" podID="34f226df-3352-4423-822c-67891ad3a398" containerID="db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792" exitCode=0
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992758 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg4jl"
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerDied","Data":"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"}
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992835 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg4jl" event={"ID":"34f226df-3352-4423-822c-67891ad3a398","Type":"ContainerDied","Data":"d0c2bd9596db386896db7ff2b9f9f3f47d22ce171a97edaa6f0b88c2da2cae3b"}
Mar 11 12:01:25 crc kubenswrapper[4816]: I0311 12:01:25.992870 4816 scope.go:117] "RemoveContainer" containerID="db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.008959 4816 scope.go:117] "RemoveContainer" containerID="796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.019229 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"]
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.021854 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg4jl"]
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.047271 4816 scope.go:117] "RemoveContainer" containerID="8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.059534 4816 scope.go:117] "RemoveContainer" containerID="db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"
Mar 11 12:01:26 crc kubenswrapper[4816]: E0311 12:01:26.060009 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792\": container with ID starting with db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792 not found: ID does not exist" containerID="db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060057 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792"} err="failed to get container status \"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792\": rpc error: code = NotFound desc = could not find container \"db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792\": container with ID starting with db4ddcf6a4270a18cdecb4d0ae5c1351dad3710879661ddc36a379be1474f792 not found: ID does not exist"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060087 4816 scope.go:117] "RemoveContainer" containerID="796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382"
Mar 11 12:01:26 crc kubenswrapper[4816]: E0311 12:01:26.060524 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382\": container with ID starting with 796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382 not found: ID does not exist" containerID="796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060552 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382"} err="failed to get container status \"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382\": rpc error: code = NotFound desc = could not find container \"796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382\": container with ID starting with 796e9f8f75ca6804fd156b8767c81403ef730e702d67b109de63a9348c222382 not found: ID does not exist"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060574 4816 scope.go:117] "RemoveContainer" containerID="8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384"
Mar 11 12:01:26 crc kubenswrapper[4816]: E0311 12:01:26.060833 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384\": container with ID starting with 8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384 not found: ID does not exist" containerID="8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.060853 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384"} err="failed to get container status \"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384\": rpc error: code = NotFound desc = could not find container \"8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384\": container with ID starting with 8d741bef2acf64d21213a84551d7a6097e01d2bf4cbc277a3b9b65411d993384 not found: ID does not exist"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.136979 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f226df-3352-4423-822c-67891ad3a398" path="/var/lib/kubelet/pods/34f226df-3352-4423-822c-67891ad3a398/volumes"
Mar 11 12:01:26 crc kubenswrapper[4816]: I0311 12:01:26.137665 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" path="/var/lib/kubelet/pods/ffe46307-0d92-4864-9aa4-b0ca2fc641d0/volumes"
Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.527330 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"]
Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.527926 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerName="controller-manager" containerID="cri-o://422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7" gracePeriod=30
Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.537652 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"]
Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.537978 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" containerName="route-controller-manager" containerID="cri-o://c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2" gracePeriod=30
Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.626547 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"]
Mar 11 12:01:28 crc kubenswrapper[4816]: I0311 12:01:28.626759 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8r7jt" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server" containerID="cri-o://3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a" gracePeriod=2
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.014295 4816 generic.go:334] "Generic (PLEG): container finished" podID="e00e505e-4736-4aee-b340-ef223d36cf41" containerID="c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2" exitCode=0
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.014378 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" event={"ID":"e00e505e-4736-4aee-b340-ef223d36cf41","Type":"ContainerDied","Data":"c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2"}
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.843297 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.849986 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8r7jt"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.855892 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.874513 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"]
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876280 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876305 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876325 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="extract-content"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876332 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="extract-content"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876345 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="extract-utilities"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876354 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="extract-utilities"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876363 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="extract-content"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876369 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="extract-content"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876377 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876383 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876395 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" containerName="route-controller-manager"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876402 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" containerName="route-controller-manager"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876412 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="extract-utilities"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876420 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="extract-utilities"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876432 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="extract-utilities"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876441 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="extract-utilities"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876453 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876463 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876477 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="extract-content"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876486 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="extract-content"
Mar 11 12:01:29 crc kubenswrapper[4816]: E0311 12:01:29.876496 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerName="controller-manager"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876506 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerName="controller-manager"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876734 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe46307-0d92-4864-9aa4-b0ca2fc641d0" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876751 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876760 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerName="controller-manager"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876770 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" containerName="route-controller-manager"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.876783 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f226df-3352-4423-822c-67891ad3a398" containerName="registry-server"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.877404 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.895462 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"]
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.901874 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.901936 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") pod \"e00e505e-4736-4aee-b340-ef223d36cf41\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.901967 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902011 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") pod \"e00e505e-4736-4aee-b340-ef223d36cf41\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902045 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902085 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") pod \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902122 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902165 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") pod \"e00e505e-4736-4aee-b340-ef223d36cf41\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902271 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") pod \"0eabc434-3f96-4124-9afc-ecb2466f2104\" (UID: \"0eabc434-3f96-4124-9afc-ecb2466f2104\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902312 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") pod \"e00e505e-4736-4aee-b340-ef223d36cf41\" (UID: \"e00e505e-4736-4aee-b340-ef223d36cf41\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902345 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") pod \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902376 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") pod \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\" (UID: \"d06617bd-ff11-42b8-9b84-e856c8c3c9eb\") "
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902690 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902727 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902780 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.902848 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.903566 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca" (OuterVolumeSpecName: "client-ca") pod "e00e505e-4736-4aee-b340-ef223d36cf41" (UID: "e00e505e-4736-4aee-b340-ef223d36cf41"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.903605 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config" (OuterVolumeSpecName: "config") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.904133 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities" (OuterVolumeSpecName: "utilities") pod "d06617bd-ff11-42b8-9b84-e856c8c3c9eb" (UID: "d06617bd-ff11-42b8-9b84-e856c8c3c9eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.904992 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.905156 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config" (OuterVolumeSpecName: "config") pod "e00e505e-4736-4aee-b340-ef223d36cf41" (UID: "e00e505e-4736-4aee-b340-ef223d36cf41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.906873 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca" (OuterVolumeSpecName: "client-ca") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.908610 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.913866 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs" (OuterVolumeSpecName: "kube-api-access-lqzfs") pod "e00e505e-4736-4aee-b340-ef223d36cf41" (UID: "e00e505e-4736-4aee-b340-ef223d36cf41"). InnerVolumeSpecName "kube-api-access-lqzfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.914052 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp" (OuterVolumeSpecName: "kube-api-access-pklmp") pod "0eabc434-3f96-4124-9afc-ecb2466f2104" (UID: "0eabc434-3f96-4124-9afc-ecb2466f2104"). InnerVolumeSpecName "kube-api-access-pklmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.924450 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4" (OuterVolumeSpecName: "kube-api-access-dgms4") pod "d06617bd-ff11-42b8-9b84-e856c8c3c9eb" (UID: "d06617bd-ff11-42b8-9b84-e856c8c3c9eb"). InnerVolumeSpecName "kube-api-access-dgms4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:01:29 crc kubenswrapper[4816]: I0311 12:01:29.925426 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e00e505e-4736-4aee-b340-ef223d36cf41" (UID: "e00e505e-4736-4aee-b340-ef223d36cf41"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.003878 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.003955 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.003981 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004018 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004062 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-client-ca\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004074 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e00e505e-4736-4aee-b340-ef223d36cf41-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004083 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgms4\" (UniqueName: \"kubernetes.io/projected/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-kube-api-access-dgms4\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004094 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pklmp\" (UniqueName: \"kubernetes.io/projected/0eabc434-3f96-4124-9afc-ecb2466f2104-kube-api-access-pklmp\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004102 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-client-ca\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004112 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eabc434-3f96-4124-9afc-ecb2466f2104-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004122 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00e505e-4736-4aee-b340-ef223d36cf41-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004134 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004261 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004405 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0eabc434-3f96-4124-9afc-ecb2466f2104-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.004449 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzfs\" (UniqueName: \"kubernetes.io/projected/e00e505e-4736-4aee-b340-ef223d36cf41-kube-api-access-lqzfs\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.005833 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.006004 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.007413 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021225 4816 generic.go:334] "Generic (PLEG): container finished" podID="0eabc434-3f96-4124-9afc-ecb2466f2104" containerID="422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7" exitCode=0
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021318 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" event={"ID":"0eabc434-3f96-4124-9afc-ecb2466f2104","Type":"ContainerDied","Data":"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"}
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021351 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4" event={"ID":"0eabc434-3f96-4124-9afc-ecb2466f2104","Type":"ContainerDied","Data":"47dc6f78b21758a900162faa2d749022ed0a0a88a3ab047d3b81d585e1f50879"}
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021368 4816 scope.go:117] "RemoveContainer" containerID="422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.021489 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-568c67664d-76gf4"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.033798 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") pod \"route-controller-manager-74bdd87f68-49lvw\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.036282 4816 generic.go:334] "Generic (PLEG): container finished" podID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" containerID="3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a" exitCode=0
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.036430 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8r7jt"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.036393 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerDied","Data":"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a"}
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.036481 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r7jt" event={"ID":"d06617bd-ff11-42b8-9b84-e856c8c3c9eb","Type":"ContainerDied","Data":"955eae43fa10fa99fdb4e5d4b56b2dccf5fef672fa575257d946cf0938c80e99"}
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.038087 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk" event={"ID":"e00e505e-4736-4aee-b340-ef223d36cf41","Type":"ContainerDied","Data":"867551228a25bbc0bc0f926b4429da94d98dd14f3c9ae4a640117cf20bb2787e"}
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.038170 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.051291 4816 scope.go:117] "RemoveContainer" containerID="422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"
Mar 11 12:01:30 crc kubenswrapper[4816]: E0311 12:01:30.055545 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7\": container with ID starting with 422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7 not found: ID does not exist" containerID="422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.055613 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7"} err="failed to get container status \"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7\": rpc error: code = NotFound desc = could not find container \"422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7\": container with ID starting with 422a80d9bd7c5ac865791576cd3ec16acf27ba26a84e00b3479a440a68ffcaf7 not found: ID does not exist"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.055660 4816 scope.go:117] "RemoveContainer" containerID="3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.056768 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"]
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.060348 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-568c67664d-76gf4"]
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.070667 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d06617bd-ff11-42b8-9b84-e856c8c3c9eb" (UID: "d06617bd-ff11-42b8-9b84-e856c8c3c9eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.077927 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"]
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.082592 4816 scope.go:117] "RemoveContainer" containerID="278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.092672 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d755bd99c-vmbvk"]
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.104714 4816 scope.go:117] "RemoveContainer" containerID="380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757"
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.106022 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06617bd-ff11-42b8-9b84-e856c8c3c9eb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.122001 4816 scope.go:117] "RemoveContainer" containerID="3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a"
Mar 11 12:01:30 crc kubenswrapper[4816]: E0311 12:01:30.122768 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a\": container
with ID starting with 3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a not found: ID does not exist" containerID="3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.122819 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a"} err="failed to get container status \"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a\": rpc error: code = NotFound desc = could not find container \"3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a\": container with ID starting with 3c59f225be65141f4253e97dee4dc069f053c3b693ce3684d2ab193b505de29a not found: ID does not exist" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.122853 4816 scope.go:117] "RemoveContainer" containerID="278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a" Mar 11 12:01:30 crc kubenswrapper[4816]: E0311 12:01:30.123235 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a\": container with ID starting with 278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a not found: ID does not exist" containerID="278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.123500 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a"} err="failed to get container status \"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a\": rpc error: code = NotFound desc = could not find container \"278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a\": container with ID starting with 278bfa92b7f4cffdd11e1b735e41bf4ff1bd011a4b249bf92a3fc0359707218a not 
found: ID does not exist" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.123545 4816 scope.go:117] "RemoveContainer" containerID="380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757" Mar 11 12:01:30 crc kubenswrapper[4816]: E0311 12:01:30.123990 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757\": container with ID starting with 380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757 not found: ID does not exist" containerID="380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.124025 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757"} err="failed to get container status \"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757\": rpc error: code = NotFound desc = could not find container \"380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757\": container with ID starting with 380c937635389798d03c93252e524044723f38b74947be3bf57ebddd7b48d757 not found: ID does not exist" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.124047 4816 scope.go:117] "RemoveContainer" containerID="c88e59606b391072deb5c308ca0b530083322e3151190cbbfebe8fb7af0870a2" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.140987 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eabc434-3f96-4124-9afc-ecb2466f2104" path="/var/lib/kubelet/pods/0eabc434-3f96-4124-9afc-ecb2466f2104/volumes" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.141727 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00e505e-4736-4aee-b340-ef223d36cf41" path="/var/lib/kubelet/pods/e00e505e-4736-4aee-b340-ef223d36cf41/volumes" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 
12:01:30.258147 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.375534 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.378481 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8r7jt"] Mar 11 12:01:30 crc kubenswrapper[4816]: I0311 12:01:30.483437 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:30 crc kubenswrapper[4816]: W0311 12:01:30.493940 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf862e1d6_c9a4_432c_b01f_610dac0371d6.slice/crio-b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c WatchSource:0}: Error finding container b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c: Status 404 returned error can't find the container with id b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.000042 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.044902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" event={"ID":"f862e1d6-c9a4-432c-b01f-610dac0371d6","Type":"ContainerStarted","Data":"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9"} Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.044949 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" event={"ID":"f862e1d6-c9a4-432c-b01f-610dac0371d6","Type":"ContainerStarted","Data":"b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c"} Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.045169 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.047761 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.059462 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" podStartSLOduration=3.059448494 podStartE2EDuration="3.059448494s" podCreationTimestamp="2026-03-11 12:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:31.058561198 +0000 UTC m=+177.649825165" watchObservedRunningTime="2026-03-11 12:01:31.059448494 +0000 UTC m=+177.650712461" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.121061 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.222385 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.267866 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.398055 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:31 crc kubenswrapper[4816]: I0311 12:01:31.433733 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.101260 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.102182 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.104795 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.105116 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.105115 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.105206 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.109870 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.112354 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.113089 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 
12:01:32.116967 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130805 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130861 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130880 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.130904 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.137397 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06617bd-ff11-42b8-9b84-e856c8c3c9eb" path="/var/lib/kubelet/pods/d06617bd-ff11-42b8-9b84-e856c8c3c9eb/volumes" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232095 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232133 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232159 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232175 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.232201 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.233788 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.234266 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.234349 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.243010 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.251846 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") pod \"controller-manager-67c5474778-rwg6j\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.437119 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.629901 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.972098 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:01:32 crc kubenswrapper[4816]: I0311 12:01:32.972169 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.015754 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.063786 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" 
event={"ID":"67d6304f-5acd-48ff-9d06-b221c14f80fc","Type":"ContainerStarted","Data":"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56"} Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.063857 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" event={"ID":"67d6304f-5acd-48ff-9d06-b221c14f80fc","Type":"ContainerStarted","Data":"e62b23734f4605f0a4a63279799bd424ddf3142a70edb89c159c912c5c2f76f1"} Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.082742 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" podStartSLOduration=5.082722316 podStartE2EDuration="5.082722316s" podCreationTimestamp="2026-03-11 12:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:33.080180354 +0000 UTC m=+179.671444331" watchObservedRunningTime="2026-03-11 12:01:33.082722316 +0000 UTC m=+179.673986283" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.111206 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.225996 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.226193 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2dh2" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" containerID="cri-o://5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" gracePeriod=2 Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.547362 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.649356 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") pod \"756dd25b-5375-48bc-8578-a9585ef49e6c\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.649445 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") pod \"756dd25b-5375-48bc-8578-a9585ef49e6c\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.649508 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsxwz\" (UniqueName: \"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") pod \"756dd25b-5375-48bc-8578-a9585ef49e6c\" (UID: \"756dd25b-5375-48bc-8578-a9585ef49e6c\") " Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.650480 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities" (OuterVolumeSpecName: "utilities") pod "756dd25b-5375-48bc-8578-a9585ef49e6c" (UID: "756dd25b-5375-48bc-8578-a9585ef49e6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.666416 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz" (OuterVolumeSpecName: "kube-api-access-vsxwz") pod "756dd25b-5375-48bc-8578-a9585ef49e6c" (UID: "756dd25b-5375-48bc-8578-a9585ef49e6c"). InnerVolumeSpecName "kube-api-access-vsxwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.716069 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "756dd25b-5375-48bc-8578-a9585ef49e6c" (UID: "756dd25b-5375-48bc-8578-a9585ef49e6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.760406 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.760445 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsxwz\" (UniqueName: \"kubernetes.io/projected/756dd25b-5375-48bc-8578-a9585ef49e6c-kube-api-access-vsxwz\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:33 crc kubenswrapper[4816]: I0311 12:01:33.760456 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/756dd25b-5375-48bc-8578-a9585ef49e6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.070836 4816 generic.go:334] "Generic (PLEG): container finished" podID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerID="5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" exitCode=0 Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.072150 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2dh2" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.074261 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerDied","Data":"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc"} Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.074329 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.074346 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2dh2" event={"ID":"756dd25b-5375-48bc-8578-a9585ef49e6c","Type":"ContainerDied","Data":"6c14169c95d372913c35e21942a8624231ab375807c0764494279b189e642cec"} Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.074372 4816 scope.go:117] "RemoveContainer" containerID="5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.077660 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.100040 4816 scope.go:117] "RemoveContainer" containerID="3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.134727 4816 scope.go:117] "RemoveContainer" containerID="c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.140626 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.140787 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-s2dh2"] Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.152605 4816 scope.go:117] "RemoveContainer" containerID="5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" Mar 11 12:01:34 crc kubenswrapper[4816]: E0311 12:01:34.153039 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc\": container with ID starting with 5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc not found: ID does not exist" containerID="5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.153198 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc"} err="failed to get container status \"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc\": rpc error: code = NotFound desc = could not find container \"5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc\": container with ID starting with 5466ae20f25fa3a1f58397452a631aeecdb16aff724bdf573d03a510178e71fc not found: ID does not exist" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.153380 4816 scope.go:117] "RemoveContainer" containerID="3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca" Mar 11 12:01:34 crc kubenswrapper[4816]: E0311 12:01:34.153839 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca\": container with ID starting with 3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca not found: ID does not exist" containerID="3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 
12:01:34.153892 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca"} err="failed to get container status \"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca\": rpc error: code = NotFound desc = could not find container \"3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca\": container with ID starting with 3da638aaf6eed9072cff54e60fb89177e6f65f1797105a333a9b3c78228673ca not found: ID does not exist" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.153927 4816 scope.go:117] "RemoveContainer" containerID="c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac" Mar 11 12:01:34 crc kubenswrapper[4816]: E0311 12:01:34.154222 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac\": container with ID starting with c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac not found: ID does not exist" containerID="c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac" Mar 11 12:01:34 crc kubenswrapper[4816]: I0311 12:01:34.154253 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac"} err="failed to get container status \"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac\": rpc error: code = NotFound desc = could not find container \"c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac\": container with ID starting with c124465b3e74b2d23eadfb691622e8fedf745d36a173b247bc81030f4e6053ac not found: ID does not exist" Mar 11 12:01:36 crc kubenswrapper[4816]: I0311 12:01:36.137082 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" 
path="/var/lib/kubelet/pods/756dd25b-5375-48bc-8578-a9585ef49e6c/volumes" Mar 11 12:01:40 crc kubenswrapper[4816]: I0311 12:01:40.975531 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" containerID="cri-o://e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc" gracePeriod=15 Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.112649 4816 generic.go:334] "Generic (PLEG): container finished" podID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerID="e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc" exitCode=0 Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.112696 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" event={"ID":"f0f288b8-4b39-42ac-9835-4fb118a86218","Type":"ContainerDied","Data":"e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc"} Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.404270 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.459829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460197 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460236 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460309 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460342 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: 
\"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460370 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460416 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460449 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460481 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc 
kubenswrapper[4816]: I0311 12:01:41.460533 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460575 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460606 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.460644 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") pod \"f0f288b8-4b39-42ac-9835-4fb118a86218\" (UID: \"f0f288b8-4b39-42ac-9835-4fb118a86218\") " Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.461591 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.462437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.462500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.465466 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.465478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.467070 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.467559 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.467921 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.468161 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.468711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.468805 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.469399 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.470400 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c" (OuterVolumeSpecName: "kube-api-access-csr9c") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "kube-api-access-csr9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.473215 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f0f288b8-4b39-42ac-9835-4fb118a86218" (UID: "f0f288b8-4b39-42ac-9835-4fb118a86218"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.562605 4816 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.562981 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563076 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563194 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563356 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-cliconfig\") on node 
\"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563455 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563545 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563635 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563724 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563811 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563896 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.563974 4816 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/f0f288b8-4b39-42ac-9835-4fb118a86218-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.564061 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csr9c\" (UniqueName: \"kubernetes.io/projected/f0f288b8-4b39-42ac-9835-4fb118a86218-kube-api-access-csr9c\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:41 crc kubenswrapper[4816]: I0311 12:01:41.564141 4816 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f0f288b8-4b39-42ac-9835-4fb118a86218-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.117959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" event={"ID":"f0f288b8-4b39-42ac-9835-4fb118a86218","Type":"ContainerDied","Data":"b0c7d7e2fb960d418e680c2e934ecd5f41d42c08329bcf33576579240438a243"} Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.118015 4816 scope.go:117] "RemoveContainer" containerID="e6147c8cc58ce3bae6f999bb1c2d0007faaa3cf350373703380a98dc3aa752bc" Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.118013 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bz2pp" Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.149219 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:01:42 crc kubenswrapper[4816]: I0311 12:01:42.153078 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bz2pp"] Mar 11 12:01:44 crc kubenswrapper[4816]: I0311 12:01:44.136533 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" path="/var/lib/kubelet/pods/f0f288b8-4b39-42ac-9835-4fb118a86218/volumes" Mar 11 12:01:48 crc kubenswrapper[4816]: I0311 12:01:48.502807 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:48 crc kubenswrapper[4816]: I0311 12:01:48.504059 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerName="controller-manager" containerID="cri-o://64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" gracePeriod=30 Mar 11 12:01:48 crc kubenswrapper[4816]: I0311 12:01:48.599947 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:48 crc kubenswrapper[4816]: I0311 12:01:48.600453 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerName="route-controller-manager" containerID="cri-o://1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" gracePeriod=30 Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.024511 4816 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.048679 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") pod \"f862e1d6-c9a4-432c-b01f-610dac0371d6\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.048759 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") pod \"f862e1d6-c9a4-432c-b01f-610dac0371d6\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.048786 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") pod \"f862e1d6-c9a4-432c-b01f-610dac0371d6\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.048818 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") pod \"f862e1d6-c9a4-432c-b01f-610dac0371d6\" (UID: \"f862e1d6-c9a4-432c-b01f-610dac0371d6\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.050465 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca" (OuterVolumeSpecName: "client-ca") pod "f862e1d6-c9a4-432c-b01f-610dac0371d6" (UID: "f862e1d6-c9a4-432c-b01f-610dac0371d6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.052433 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config" (OuterVolumeSpecName: "config") pod "f862e1d6-c9a4-432c-b01f-610dac0371d6" (UID: "f862e1d6-c9a4-432c-b01f-610dac0371d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.054281 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f862e1d6-c9a4-432c-b01f-610dac0371d6" (UID: "f862e1d6-c9a4-432c-b01f-610dac0371d6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.054567 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx" (OuterVolumeSpecName: "kube-api-access-9rxvx") pod "f862e1d6-c9a4-432c-b01f-610dac0371d6" (UID: "f862e1d6-c9a4-432c-b01f-610dac0371d6"). InnerVolumeSpecName "kube-api-access-9rxvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.087972 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.114315 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl"] Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.114715 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.114780 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.114869 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerName="route-controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.114932 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerName="route-controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.115003 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.115071 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.115130 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerName="controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.115213 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerName="controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.115291 4816 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="extract-content" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116007 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="extract-content" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.116074 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="extract-utilities" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="extract-utilities" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116331 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerName="controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116451 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f288b8-4b39-42ac-9835-4fb118a86218" containerName="oauth-openshift" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116524 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerName="route-controller-manager" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.116587 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="756dd25b-5375-48bc-8578-a9585ef49e6c" containerName="registry-server" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.117200 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.120838 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121049 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121163 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121370 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121490 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.121588 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.123589 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.126808 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.127138 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.127440 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 12:01:49 crc 
kubenswrapper[4816]: I0311 12:01:49.127594 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.128533 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.129887 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.132099 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.134859 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.136932 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150091 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150171 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150233 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150306 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150370 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") pod \"67d6304f-5acd-48ff-9d06-b221c14f80fc\" (UID: \"67d6304f-5acd-48ff-9d06-b221c14f80fc\") " Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150518 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150536 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-login\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150568 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-session\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150596 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-error\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150615 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2z59\" (UniqueName: \"kubernetes.io/projected/ea112c1f-2bbd-48bb-979e-980a6486f185-kube-api-access-c2z59\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150659 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150675 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-dir\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150692 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-service-ca\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150739 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " 
pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150759 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150779 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150797 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-policies\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150832 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f862e1d6-c9a4-432c-b01f-610dac0371d6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150843 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150852 
4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f862e1d6-c9a4-432c-b01f-610dac0371d6-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.150913 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rxvx\" (UniqueName: \"kubernetes.io/projected/f862e1d6-c9a4-432c-b01f-610dac0371d6-kube-api-access-9rxvx\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.151546 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.151568 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.151666 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config" (OuterVolumeSpecName: "config") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.153318 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.153366 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt" (OuterVolumeSpecName: "kube-api-access-cxdtt") pod "67d6304f-5acd-48ff-9d06-b221c14f80fc" (UID: "67d6304f-5acd-48ff-9d06-b221c14f80fc"). InnerVolumeSpecName "kube-api-access-cxdtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160127 4816 generic.go:334] "Generic (PLEG): container finished" podID="67d6304f-5acd-48ff-9d06-b221c14f80fc" containerID="64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" exitCode=0 Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160188 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" event={"ID":"67d6304f-5acd-48ff-9d06-b221c14f80fc","Type":"ContainerDied","Data":"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56"} Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" event={"ID":"67d6304f-5acd-48ff-9d06-b221c14f80fc","Type":"ContainerDied","Data":"e62b23734f4605f0a4a63279799bd424ddf3142a70edb89c159c912c5c2f76f1"} Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160235 4816 scope.go:117] "RemoveContainer" 
containerID="64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.160361 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c5474778-rwg6j" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.164805 4816 generic.go:334] "Generic (PLEG): container finished" podID="f862e1d6-c9a4-432c-b01f-610dac0371d6" containerID="1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" exitCode=0 Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.164842 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" event={"ID":"f862e1d6-c9a4-432c-b01f-610dac0371d6","Type":"ContainerDied","Data":"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9"} Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.164864 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" event={"ID":"f862e1d6-c9a4-432c-b01f-610dac0371d6","Type":"ContainerDied","Data":"b5aa380dfae5e5fb091865b30d4c19f63f255e625fdd2414ee2e29d633336c4c"} Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.164907 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.187275 4816 scope.go:117] "RemoveContainer" containerID="64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.187724 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56\": container with ID starting with 64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56 not found: ID does not exist" containerID="64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.187771 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56"} err="failed to get container status \"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56\": rpc error: code = NotFound desc = could not find container \"64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56\": container with ID starting with 64618ab3206e35f87bc9a73a3adb11b24a0d0a08e8a4aa35de128b5b11632d56 not found: ID does not exist" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.187798 4816 scope.go:117] "RemoveContainer" containerID="1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.188922 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.197312 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67c5474778-rwg6j"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.201151 4816 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.203807 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74bdd87f68-49lvw"] Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.206380 4816 scope.go:117] "RemoveContainer" containerID="1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" Mar 11 12:01:49 crc kubenswrapper[4816]: E0311 12:01:49.206855 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9\": container with ID starting with 1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9 not found: ID does not exist" containerID="1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.206888 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9"} err="failed to get container status \"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9\": rpc error: code = NotFound desc = could not find container \"1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9\": container with ID starting with 1113e474910e2bfea68438fabb0045c35bed39208a4635e0633ad3f3178ea6a9 not found: ID does not exist" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251565 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-router-certs\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 
12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251611 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-dir\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251633 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251662 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-service-ca\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251693 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251714 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251733 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251750 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-policies\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251766 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251783 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " 
pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251800 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-login\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251828 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-session\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251858 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-error\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251875 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2z59\" (UniqueName: \"kubernetes.io/projected/ea112c1f-2bbd-48bb-979e-980a6486f185-kube-api-access-c2z59\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251908 4816 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251920 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251929 4816 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67d6304f-5acd-48ff-9d06-b221c14f80fc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251938 4816 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d6304f-5acd-48ff-9d06-b221c14f80fc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.251948 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxdtt\" (UniqueName: \"kubernetes.io/projected/67d6304f-5acd-48ff-9d06-b221c14f80fc-kube-api-access-cxdtt\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.252931 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-dir\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.253731 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-service-ca\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " 
pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.253875 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.253976 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-audit-policies\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.253986 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257112 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-error\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257235 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-session\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257363 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257369 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257403 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-user-template-login\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257617 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-router-certs\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " 
pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.257757 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.258104 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea112c1f-2bbd-48bb-979e-980a6486f185-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.267794 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2z59\" (UniqueName: \"kubernetes.io/projected/ea112c1f-2bbd-48bb-979e-980a6486f185-kube-api-access-c2z59\") pod \"oauth-openshift-77d5dbcfdf-rvkbl\" (UID: \"ea112c1f-2bbd-48bb-979e-980a6486f185\") " pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.441054 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" Mar 11 12:01:49 crc kubenswrapper[4816]: I0311 12:01:49.812601 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl"] Mar 11 12:01:49 crc kubenswrapper[4816]: W0311 12:01:49.818019 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea112c1f_2bbd_48bb_979e_980a6486f185.slice/crio-b4e06439659818c228d6a9bff54daaf6207e7a57e6f0d975ac8144ca723669c6 WatchSource:0}: Error finding container b4e06439659818c228d6a9bff54daaf6207e7a57e6f0d975ac8144ca723669c6: Status 404 returned error can't find the container with id b4e06439659818c228d6a9bff54daaf6207e7a57e6f0d975ac8144ca723669c6 Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.126565 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-585b8644c9-vg9hh"] Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.128302 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.130889 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.131380 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.133360 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.133652 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.134373 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.134607 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.145842 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.146764 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d6304f-5acd-48ff-9d06-b221c14f80fc" path="/var/lib/kubelet/pods/67d6304f-5acd-48ff-9d06-b221c14f80fc/volumes" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.147578 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f862e1d6-c9a4-432c-b01f-610dac0371d6" path="/var/lib/kubelet/pods/f862e1d6-c9a4-432c-b01f-610dac0371d6/volumes" Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.148873 4816 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"]
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.149655 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585b8644c9-vg9hh"]
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.149683 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"]
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.149761 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.152301 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.152322 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.152819 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.153415 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.153516 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.153421 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163431 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-config\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163468 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-client-ca\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163525 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-proxy-ca-bundles\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163553 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgjq\" (UniqueName: \"kubernetes.io/projected/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-kube-api-access-hhgjq\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.163572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-serving-cert\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.170903 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" event={"ID":"ea112c1f-2bbd-48bb-979e-980a6486f185","Type":"ContainerStarted","Data":"47232ae5d93011aab131314d06cce85fb43c74c36f99bff53e4b62955cbb1144"}
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.170943 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" event={"ID":"ea112c1f-2bbd-48bb-979e-980a6486f185","Type":"ContainerStarted","Data":"b4e06439659818c228d6a9bff54daaf6207e7a57e6f0d975ac8144ca723669c6"}
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.171080 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.188802 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl" podStartSLOduration=35.188780354 podStartE2EDuration="35.188780354s" podCreationTimestamp="2026-03-11 12:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:50.187652962 +0000 UTC m=+196.778916929" watchObservedRunningTime="2026-03-11 12:01:50.188780354 +0000 UTC m=+196.780044331"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264615 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-serving-cert\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8rg2\" (UniqueName: \"kubernetes.io/projected/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-kube-api-access-s8rg2\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264805 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-config\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-client-ca\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264928 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-proxy-ca-bundles\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.264988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgjq\" (UniqueName: \"kubernetes.io/projected/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-kube-api-access-hhgjq\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.265012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-config\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.265032 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-serving-cert\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.265053 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-client-ca\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.266413 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-proxy-ca-bundles\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.266966 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-client-ca\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.267168 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-config\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.276094 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-serving-cert\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.282720 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgjq\" (UniqueName: \"kubernetes.io/projected/5e386f67-a816-4b53-b39a-5db0f6dfbc2a-kube-api-access-hhgjq\") pod \"controller-manager-585b8644c9-vg9hh\" (UID: \"5e386f67-a816-4b53-b39a-5db0f6dfbc2a\") " pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.366614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-config\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.366667 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-client-ca\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.366727 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-serving-cert\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.366767 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8rg2\" (UniqueName: \"kubernetes.io/projected/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-kube-api-access-s8rg2\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.367984 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-client-ca\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.368066 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-config\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.369653 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-serving-cert\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.381674 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8rg2\" (UniqueName: \"kubernetes.io/projected/5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b-kube-api-access-s8rg2\") pod \"route-controller-manager-777f98b8fc-ds5f7\" (UID: \"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b\") " pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.441918 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.467867 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.560018 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-77d5dbcfdf-rvkbl"
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.833856 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585b8644c9-vg9hh"]
Mar 11 12:01:50 crc kubenswrapper[4816]: I0311 12:01:50.934673 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"]
Mar 11 12:01:50 crc kubenswrapper[4816]: W0311 12:01:50.940789 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be5607d_c6a3_4ccd_9e3f_99c57bc38d7b.slice/crio-17ec4d81d1c2fe5b14d8c9342e15ad31dceb380c8d8f58ebd7ba64bc4c6d70d2 WatchSource:0}: Error finding container 17ec4d81d1c2fe5b14d8c9342e15ad31dceb380c8d8f58ebd7ba64bc4c6d70d2: Status 404 returned error can't find the container with id 17ec4d81d1c2fe5b14d8c9342e15ad31dceb380c8d8f58ebd7ba64bc4c6d70d2
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.180357 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" event={"ID":"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b","Type":"ContainerStarted","Data":"7ded85d5e7523ff25bd756f7359889fd4e2bbcc50c9aa1df594c8d47c53c49fa"}
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.180651 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.180671 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" event={"ID":"5be5607d-c6a3-4ccd-9e3f-99c57bc38d7b","Type":"ContainerStarted","Data":"17ec4d81d1c2fe5b14d8c9342e15ad31dceb380c8d8f58ebd7ba64bc4c6d70d2"}
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.181699 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" event={"ID":"5e386f67-a816-4b53-b39a-5db0f6dfbc2a","Type":"ContainerStarted","Data":"05be18498e3eb84ca9646678a7178cc7af3d42649e2ab55a6755f06ad29010c3"}
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.181736 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" event={"ID":"5e386f67-a816-4b53-b39a-5db0f6dfbc2a","Type":"ContainerStarted","Data":"963822b7511d0122298da214e4a8ce92cb7fa4e421d6c14b5e157cabb6d5d894"}
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.181999 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.190009 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh"
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.197903 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7" podStartSLOduration=3.197888427 podStartE2EDuration="3.197888427s" podCreationTimestamp="2026-03-11 12:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:51.194403308 +0000 UTC m=+197.785667265" watchObservedRunningTime="2026-03-11 12:01:51.197888427 +0000 UTC m=+197.789152394"
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.220120 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-585b8644c9-vg9hh" podStartSLOduration=3.220103933 podStartE2EDuration="3.220103933s" podCreationTimestamp="2026-03-11 12:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:01:51.219448244 +0000 UTC m=+197.810712201" watchObservedRunningTime="2026-03-11 12:01:51.220103933 +0000 UTC m=+197.811367900"
Mar 11 12:01:51 crc kubenswrapper[4816]: I0311 12:01:51.443394 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-777f98b8fc-ds5f7"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.936223 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937222 4816 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937355 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937510 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46" gracePeriod=15
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937536 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56" gracePeriod=15
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937590 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://964c09610d05fa085a4adc7f7d902f67376a9168848e403cd849cfc2290dc26d" gracePeriod=15
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937590 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594" gracePeriod=15
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.937604 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3" gracePeriod=15
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938384 4816 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938601 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938617 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938625 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938633 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938641 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938648 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938658 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938663 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938670 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938677 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938692 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938698 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938712 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938717 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938727 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938733 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938835 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938848 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938859 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938869 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938876 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938884 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938891 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938979 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938986 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.938993 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.938999 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.939124 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: I0311 12:01:52.939135 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 11 12:01:52 crc kubenswrapper[4816]: E0311 12:01:52.980113 4816 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.94:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005673 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005720 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005752 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005777 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005798 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005816 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005839 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.005859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107526 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107607 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107641 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107653 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107696 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107714 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107849 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107868 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107895 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107914 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107969 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.107986 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.108016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.108038 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.108056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11
12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.192727 4816 generic.go:334] "Generic (PLEG): container finished" podID="106a80c4-7132-43b4-930f-bd886787437f" containerID="13cc1621a3a1352dc36083505ef9245a833ca0fab13f1b74079c751c4ed90659" exitCode=0 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.192789 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"106a80c4-7132-43b4-930f-bd886787437f","Type":"ContainerDied","Data":"13cc1621a3a1352dc36083505ef9245a833ca0fab13f1b74079c751c4ed90659"} Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.193214 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.193457 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.196535 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198152 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198791 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="964c09610d05fa085a4adc7f7d902f67376a9168848e403cd849cfc2290dc26d" exitCode=0 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198809 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56" exitCode=0 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198820 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594" exitCode=0 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.198827 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3" exitCode=2 Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.199399 4816 scope.go:117] "RemoveContainer" containerID="eea025ef475dfe78f639596b3e2942a59ebc06877f9d2ac553ad7b41daa98dbd" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.280908 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:53 crc kubenswrapper[4816]: W0311 12:01:53.312752 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-90fc4142aaecf9c3b788b3691ff3cbfa63e2833d66c3cb5a40fe3a191416e240 WatchSource:0}: Error finding container 90fc4142aaecf9c3b788b3691ff3cbfa63e2833d66c3cb5a40fe3a191416e240: Status 404 returned error can't find the container with id 90fc4142aaecf9c3b788b3691ff3cbfa63e2833d66c3cb5a40fe3a191416e240 Mar 11 12:01:53 crc kubenswrapper[4816]: E0311 12:01:53.319007 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.94:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189bc7c1c32a9eef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 12:01:53.317388015 +0000 UTC m=+199.908651982,LastTimestamp:2026-03-11 12:01:53.317388015 +0000 UTC m=+199.908651982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.588513 4816 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 11 12:01:53 crc kubenswrapper[4816]: I0311 12:01:53.588594 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.132418 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.132745 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.207756 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a"} Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.207843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"90fc4142aaecf9c3b788b3691ff3cbfa63e2833d66c3cb5a40fe3a191416e240"} Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.210107 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: E0311 12:01:54.210338 4816 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.94:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.212960 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.527754 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.528636 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632235 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") pod \"106a80c4-7132-43b4-930f-bd886787437f\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632326 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") pod \"106a80c4-7132-43b4-930f-bd886787437f\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632352 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") pod \"106a80c4-7132-43b4-930f-bd886787437f\" (UID: \"106a80c4-7132-43b4-930f-bd886787437f\") " Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632351 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock" (OuterVolumeSpecName: "var-lock") pod "106a80c4-7132-43b4-930f-bd886787437f" (UID: "106a80c4-7132-43b4-930f-bd886787437f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632479 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "106a80c4-7132-43b4-930f-bd886787437f" (UID: "106a80c4-7132-43b4-930f-bd886787437f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632664 4816 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-var-lock\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.632684 4816 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/106a80c4-7132-43b4-930f-bd886787437f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.636899 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "106a80c4-7132-43b4-930f-bd886787437f" (UID: "106a80c4-7132-43b4-930f-bd886787437f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:01:54 crc kubenswrapper[4816]: I0311 12:01:54.734285 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/106a80c4-7132-43b4-930f-bd886787437f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.222359 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.222345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"106a80c4-7132-43b4-930f-bd886787437f","Type":"ContainerDied","Data":"b87f445ca27d573faee92ddd515c624b2b710e714f620c36718ab43fc1a2134f"} Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.222778 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b87f445ca27d573faee92ddd515c624b2b710e714f620c36718ab43fc1a2134f" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.225293 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.225961 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46" exitCode=0 Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.237652 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.709649 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.710582 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.711431 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.711929 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748197 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748577 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748666 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.748672 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.749021 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.749045 4816 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.749290 4816 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.798422 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.798777 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.799133 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.799497 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.799867 4816 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.94:6443: connect: connection refused" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.800195 4816 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 11 12:01:55 crc kubenswrapper[4816]: E0311 12:01:55.801391 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="200ms" Mar 11 12:01:55 crc kubenswrapper[4816]: I0311 12:01:55.850601 4816 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 12:01:56 crc kubenswrapper[4816]: E0311 12:01:56.001732 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="400ms" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.138822 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.235800 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.236415 4816 scope.go:117] "RemoveContainer" containerID="964c09610d05fa085a4adc7f7d902f67376a9168848e403cd849cfc2290dc26d" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.236556 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.237311 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.237513 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.240697 4816 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.241018 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.256573 4816 scope.go:117] "RemoveContainer" containerID="f04cdf2254cd3d070567bec1a9b10d6ffff3f5da5056b637b7d006f4ded72e56" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.268122 4816 scope.go:117] "RemoveContainer" containerID="c5e6ee0da068d98d88f55efae8cb0cb12fe57c85e11f5638daaa5e0f8a1f8594" Mar 11 12:01:56 crc 
kubenswrapper[4816]: I0311 12:01:56.284148 4816 scope.go:117] "RemoveContainer" containerID="6180f737a5d60df3a71764fb2eaca26d3b25306cd8653d66d0b7fab4ec7debe3" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.297178 4816 scope.go:117] "RemoveContainer" containerID="c2fca9f57b03035a1290e3686e7b98d15f9151ad5f5b811112ad882b47cb9e46" Mar 11 12:01:56 crc kubenswrapper[4816]: I0311 12:01:56.312320 4816 scope.go:117] "RemoveContainer" containerID="789a3fa60b21759f42c2997678010f994718ce5057a3a059491bc930652d3e38" Mar 11 12:01:56 crc kubenswrapper[4816]: E0311 12:01:56.403203 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="800ms" Mar 11 12:01:57 crc kubenswrapper[4816]: E0311 12:01:57.203951 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="1.6s" Mar 11 12:01:57 crc kubenswrapper[4816]: E0311 12:01:57.595273 4816 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.94:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189bc7c1c32a9eef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 12:01:53.317388015 +0000 UTC m=+199.908651982,LastTimestamp:2026-03-11 12:01:53.317388015 +0000 UTC m=+199.908651982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 12:01:58 crc kubenswrapper[4816]: E0311 12:01:58.805441 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="3.2s" Mar 11 12:02:02 crc kubenswrapper[4816]: E0311 12:02:02.006236 4816 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.94:6443: connect: connection refused" interval="6.4s" Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.130024 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.136909 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused"
Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.138270 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused"
Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.149920 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.149951 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:04 crc kubenswrapper[4816]: E0311 12:02:04.150357 4816 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.151092 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:04 crc kubenswrapper[4816]: W0311 12:02:04.172491 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-4f610d8ade4556e966386e3da3d1fe7e8b8c2ec47fc3a13b82fdf9f538c3eda0 WatchSource:0}: Error finding container 4f610d8ade4556e966386e3da3d1fe7e8b8c2ec47fc3a13b82fdf9f538c3eda0: Status 404 returned error can't find the container with id 4f610d8ade4556e966386e3da3d1fe7e8b8c2ec47fc3a13b82fdf9f538c3eda0
Mar 11 12:02:04 crc kubenswrapper[4816]: I0311 12:02:04.289140 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f610d8ade4556e966386e3da3d1fe7e8b8c2ec47fc3a13b82fdf9f538c3eda0"}
Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.296763 4816 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="bf0572f97888022c815137b73179893aed925bed6d8fd477a66e6a0e36c3abd2" exitCode=0
Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.296839 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bf0572f97888022c815137b73179893aed925bed6d8fd477a66e6a0e36c3abd2"}
Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.297199 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.297232 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:05 crc kubenswrapper[4816]: E0311 12:02:05.297805 4816 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.94:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:05 crc kubenswrapper[4816]: I0311 12:02:05.297924 4816 status_manager.go:851] "Failed to get status for pod" podUID="106a80c4-7132-43b4-930f-bd886787437f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.94:6443: connect: connection refused"
Mar 11 12:02:06 crc kubenswrapper[4816]: I0311 12:02:06.306370 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"457ece82db84f0b7f8d8eeffb7fa8a017dfd31e1afd92d2ceceab272dc7da47f"}
Mar 11 12:02:06 crc kubenswrapper[4816]: I0311 12:02:06.306668 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2422d8f7d6c84293d4f1c6c383181086d1b57d6cc94e90b0c097eb1178ab65a4"}
Mar 11 12:02:06 crc kubenswrapper[4816]: I0311 12:02:06.306679 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b64874309994a7a7ce05a7c55bd76eb410d35646b17031929be8e9fff6fc32cd"}
Mar 11 12:02:06 crc kubenswrapper[4816]: I0311 12:02:06.306687 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8cb5937837be06bc5590391cc325c6ef33b8a084057540c4b60a6891879630c5"}
Mar 11 12:02:07 crc kubenswrapper[4816]: I0311 12:02:07.319294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"27b8cff796af4e02c51a8468aa35452cb9e89bae3e503ba9531778f54c82e163"}
Mar 11 12:02:07 crc kubenswrapper[4816]: I0311 12:02:07.319519 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:07 crc kubenswrapper[4816]: I0311 12:02:07.319731 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:07 crc kubenswrapper[4816]: I0311 12:02:07.319767 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.328869 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.330020 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.330096 4816 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4cc365d25b754728795b200a155fa9bd64393ac8ec89f832fb06fc0f17e72cb5" exitCode=1
Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.330146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4cc365d25b754728795b200a155fa9bd64393ac8ec89f832fb06fc0f17e72cb5"}
Mar 11 12:02:08 crc kubenswrapper[4816]: I0311 12:02:08.330851 4816 scope.go:117] "RemoveContainer" containerID="4cc365d25b754728795b200a155fa9bd64393ac8ec89f832fb06fc0f17e72cb5"
Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.151304 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.151376 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.157319 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.341635 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.342375 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.342458 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b392c9212fce86344701d69479c22395c3f78b0384e4f9171f781a1c50cb91f4"}
Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.515562 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:02:09 crc kubenswrapper[4816]: I0311 12:02:09.516063 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:02:12 crc kubenswrapper[4816]: I0311 12:02:12.330339 4816 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:12 crc kubenswrapper[4816]: I0311 12:02:12.359739 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:12 crc kubenswrapper[4816]: I0311 12:02:12.359773 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:12 crc kubenswrapper[4816]: I0311 12:02:12.363376 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 12:02:13 crc kubenswrapper[4816]: I0311 12:02:13.366130 4816 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:13 crc kubenswrapper[4816]: I0311 12:02:13.366538 4816 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4a8570a1-8304-4344-ac73-7346c594a222"
Mar 11 12:02:14 crc kubenswrapper[4816]: I0311 12:02:14.144066 4816 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e5be8a02-8a6a-405e-b631-53e3c73dbc0f"
Mar 11 12:02:14 crc kubenswrapper[4816]: I0311 12:02:14.462548 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 12:02:14 crc kubenswrapper[4816]: I0311 12:02:14.463184 4816 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 11 12:02:14 crc kubenswrapper[4816]: I0311 12:02:14.463229 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 11 12:02:17 crc kubenswrapper[4816]: I0311 12:02:17.142524 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 12:02:18 crc kubenswrapper[4816]: I0311 12:02:18.771615 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 11 12:02:19 crc kubenswrapper[4816]: I0311 12:02:19.479478 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 11 12:02:22 crc kubenswrapper[4816]: I0311 12:02:22.129580 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 11 12:02:22 crc kubenswrapper[4816]: I0311 12:02:22.729128 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 11 12:02:22 crc kubenswrapper[4816]: I0311 12:02:22.916524 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.179000 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.273115 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.420669 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.428803 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.722794 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.871498 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 11 12:02:23 crc kubenswrapper[4816]: I0311 12:02:23.940523 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.136627 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.198938 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.252720 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.395327 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.447710 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.465449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.471141 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.488944 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.799083 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.817791 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 11 12:02:24 crc kubenswrapper[4816]: I0311 12:02:24.956140 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.033385 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.045005 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.101545 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.127315 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.284051 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.446685 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.477010 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.821717 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 11 12:02:25 crc kubenswrapper[4816]: I0311 12:02:25.856471 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.162718 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.262274 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.329695 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.385556 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.630209 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.638825 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.695318 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.727845 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.739988 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.873962 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.905323 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.914170 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 11 12:02:26 crc kubenswrapper[4816]: I0311 12:02:26.985863 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.126085 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.138200 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.210365 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.280482 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.409211 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.573608 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.582319 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.603221 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 11 12:02:27 crc kubenswrapper[4816]: I0311 12:02:27.912624 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.029495 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.139200 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.155518 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.164740 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.174830 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.250510 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.277776 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.354703 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.506227 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.674024 4816 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.674933 4816 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.708236 4816 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.744548 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.776982 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.845446 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.880073 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 11 12:02:28 crc kubenswrapper[4816]: I0311 12:02:28.936542 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.013437 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.125066 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.244061 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.273997 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.321877 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.365420 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.373923 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.441751 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.491715 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.581947 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.582039 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.656124 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.668192 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.682909 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.709894 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.762988 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.767739 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.867969 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.899194 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.952811 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.968076 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 11 12:02:29 crc kubenswrapper[4816]: I0311 12:02:29.976393 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.079144 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.094933 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.100744 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.300716 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.346822 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.383617 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.468164 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.593459 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.593488 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.703853 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.728714 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.766588 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.772204 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.858044 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.901120 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.955418 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.985383 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 11 12:02:30 crc kubenswrapper[4816]: I0311 12:02:30.999885 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.022217 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.058493 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.082651 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.089767 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.128215 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.259585 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.283025 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.292281 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.293239 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.335496 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.397268 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.415980 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.429183 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.480139 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.646674 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.674919 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.677052 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.699926 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.771911 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.842857 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.854822 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 11 12:02:31 crc kubenswrapper[4816]: I0311 12:02:31.935810 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.014972 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.017021 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.067906 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.187478 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.191941 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.194410 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.204732 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.262884 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.421592 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.471454 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.497978 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.522978 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.596392 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.601923 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.612554 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.646788 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.648945 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.683673 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.738927 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311 12:02:32.796843 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 11 12:02:32 crc kubenswrapper[4816]: I0311
12:02:32.851301 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.181429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.215998 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.219529 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.254431 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.283503 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.402888 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.425324 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.447888 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.474049 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.560337 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 
12:02:33.628127 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.647045 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.681961 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.716215 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.775700 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.853585 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.863049 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.885622 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 11 12:02:33 crc kubenswrapper[4816]: I0311 12:02:33.927985 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.090646 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.094538 4816 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress"/"router-certs-default" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.161727 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.167678 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.295959 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.333837 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.337614 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.367989 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.431594 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.441176 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.441348 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.466506 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.469995 4816 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.692823 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.727774 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.794264 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.828581 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.847934 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.877127 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.902997 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.926098 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.977512 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 12:02:34 crc kubenswrapper[4816]: I0311 12:02:34.978533 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.203747 4816 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.208798 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.225074 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.256645 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.323086 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.418168 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.706609 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.747501 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.777148 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.780111 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.850861 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.865525 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 11 12:02:35 crc kubenswrapper[4816]: I0311 12:02:35.920664 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.033798 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.038901 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.098147 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.228438 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.278696 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.332001 4816 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.344831 4816 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.526284 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.597079 4816 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.619937 4816 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.624444 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.624501 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.628391 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.645129 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.645113139 podStartE2EDuration="24.645113139s" podCreationTimestamp="2026-03-11 12:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:02:36.644752129 +0000 UTC m=+243.236016096" watchObservedRunningTime="2026-03-11 12:02:36.645113139 +0000 UTC m=+243.236377106" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.856606 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.868305 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.923005 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.960324 4816 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 11 12:02:36 crc kubenswrapper[4816]: I0311 12:02:36.965838 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.051925 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.123427 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.137946 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.154501 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.230140 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.300287 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.308340 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.457590 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.512202 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 11 
12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.696070 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.705665 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.745816 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.777043 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.791177 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.862976 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 11 12:02:37 crc kubenswrapper[4816]: I0311 12:02:37.870088 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.329147 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.352889 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.407767 4816 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.449843 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 12:02:38 crc kubenswrapper[4816]: I0311 12:02:38.688454 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.172028 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.199184 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.222072 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.325760 4816 ???:1] "http: TLS handshake error from 192.168.126.11:59192: no serving certificate available for the kubelet" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.514919 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.514984 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.533820 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:02:39 crc kubenswrapper[4816]: E0311 12:02:39.534050 4816 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106a80c4-7132-43b4-930f-bd886787437f" containerName="installer" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.534065 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="106a80c4-7132-43b4-930f-bd886787437f" containerName="installer" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.534183 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="106a80c4-7132-43b4-930f-bd886787437f" containerName="installer" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.534552 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.536588 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.536626 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.537320 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.543130 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.647447 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") pod \"auto-csr-approver-29553842-6xkxh\" (UID: \"ba5c6602-69d6-46be-a23b-fb4d6290a974\") " pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.748657 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") pod \"auto-csr-approver-29553842-6xkxh\" (UID: \"ba5c6602-69d6-46be-a23b-fb4d6290a974\") " pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.766481 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") pod \"auto-csr-approver-29553842-6xkxh\" (UID: \"ba5c6602-69d6-46be-a23b-fb4d6290a974\") " pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.840100 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 12:02:39 crc kubenswrapper[4816]: I0311 12:02:39.849507 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.220397 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.245854 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.250380 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.378762 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 12:02:40 crc kubenswrapper[4816]: I0311 12:02:40.512277 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" 
event={"ID":"ba5c6602-69d6-46be-a23b-fb4d6290a974","Type":"ContainerStarted","Data":"e209630d2c6430ce88eea44392e08d1ea6502f314f9a6b9b81af4242ac59ed97"} Mar 11 12:02:41 crc kubenswrapper[4816]: I0311 12:02:41.589260 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 11 12:02:42 crc kubenswrapper[4816]: I0311 12:02:42.403633 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 12:02:42 crc kubenswrapper[4816]: I0311 12:02:42.636566 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 12:02:45 crc kubenswrapper[4816]: I0311 12:02:45.946812 4816 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 12:02:45 crc kubenswrapper[4816]: I0311 12:02:45.947144 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a" gracePeriod=5 Mar 11 12:02:46 crc kubenswrapper[4816]: I0311 12:02:46.531494 4816 csr.go:261] certificate signing request csr-mqscd is approved, waiting to be issued Mar 11 12:02:46 crc kubenswrapper[4816]: I0311 12:02:46.554133 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" event={"ID":"ba5c6602-69d6-46be-a23b-fb4d6290a974","Type":"ContainerStarted","Data":"2cfac82b0530dfec9409f269e3ee40d7a556b84403bec8f94f82329b0208a810"} Mar 11 12:02:46 crc kubenswrapper[4816]: I0311 12:02:46.556147 4816 csr.go:257] certificate signing request csr-mqscd is issued Mar 11 12:02:46 crc kubenswrapper[4816]: I0311 12:02:46.570059 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" podStartSLOduration=8.836376387 podStartE2EDuration="14.570039008s" podCreationTimestamp="2026-03-11 12:02:32 +0000 UTC" firstStartedPulling="2026-03-11 12:02:40.23119362 +0000 UTC m=+246.822457587" lastFinishedPulling="2026-03-11 12:02:45.964856241 +0000 UTC m=+252.556120208" observedRunningTime="2026-03-11 12:02:46.567514595 +0000 UTC m=+253.158778562" watchObservedRunningTime="2026-03-11 12:02:46.570039008 +0000 UTC m=+253.161302975"
Mar 11 12:02:47 crc kubenswrapper[4816]: I0311 12:02:47.558058 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-04 14:38:11.924945389 +0000 UTC
Mar 11 12:02:47 crc kubenswrapper[4816]: I0311 12:02:47.558356 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6434h35m24.366592592s for next certificate rotation
Mar 11 12:02:47 crc kubenswrapper[4816]: I0311 12:02:47.560483 4816 generic.go:334] "Generic (PLEG): container finished" podID="ba5c6602-69d6-46be-a23b-fb4d6290a974" containerID="2cfac82b0530dfec9409f269e3ee40d7a556b84403bec8f94f82329b0208a810" exitCode=0
Mar 11 12:02:47 crc kubenswrapper[4816]: I0311 12:02:47.560520 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" event={"ID":"ba5c6602-69d6-46be-a23b-fb4d6290a974","Type":"ContainerDied","Data":"2cfac82b0530dfec9409f269e3ee40d7a556b84403bec8f94f82329b0208a810"}
Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.558961 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-15 01:29:14.479290278 +0000 UTC
Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.559014 4816 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7429h26m25.920279579s for next certificate rotation
Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.804585 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553842-6xkxh"
Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.976116 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") pod \"ba5c6602-69d6-46be-a23b-fb4d6290a974\" (UID: \"ba5c6602-69d6-46be-a23b-fb4d6290a974\") "
Mar 11 12:02:48 crc kubenswrapper[4816]: I0311 12:02:48.983187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm" (OuterVolumeSpecName: "kube-api-access-vcdsm") pod "ba5c6602-69d6-46be-a23b-fb4d6290a974" (UID: "ba5c6602-69d6-46be-a23b-fb4d6290a974"). InnerVolumeSpecName "kube-api-access-vcdsm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:02:49 crc kubenswrapper[4816]: I0311 12:02:49.077606 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcdsm\" (UniqueName: \"kubernetes.io/projected/ba5c6602-69d6-46be-a23b-fb4d6290a974-kube-api-access-vcdsm\") on node \"crc\" DevicePath \"\""
Mar 11 12:02:49 crc kubenswrapper[4816]: I0311 12:02:49.572989 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553842-6xkxh" event={"ID":"ba5c6602-69d6-46be-a23b-fb4d6290a974","Type":"ContainerDied","Data":"e209630d2c6430ce88eea44392e08d1ea6502f314f9a6b9b81af4242ac59ed97"}
Mar 11 12:02:49 crc kubenswrapper[4816]: I0311 12:02:49.573041 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e209630d2c6430ce88eea44392e08d1ea6502f314f9a6b9b81af4242ac59ed97"
Mar 11 12:02:49 crc kubenswrapper[4816]: I0311 12:02:49.573119 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553842-6xkxh"
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.517094 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.517417 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.582271 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.582314 4816 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a" exitCode=137
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.582357 4816 scope.go:117] "RemoveContainer" containerID="010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a"
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.582394 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.596668 4816 scope.go:117] "RemoveContainer" containerID="010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a"
Mar 11 12:02:51 crc kubenswrapper[4816]: E0311 12:02:51.597191 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a\": container with ID starting with 010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a not found: ID does not exist" containerID="010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a"
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.597270 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a"} err="failed to get container status \"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a\": rpc error: code = NotFound desc = could not find container \"010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a\": container with ID starting with 010629f9f0bfab0e230041f23562998b4b2ec0dfca10bba643ba9582fca68b1a not found: ID does not exist"
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608668 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608714 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608770 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608817 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.608839 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.609089 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.609130 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.609154 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.609175 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.614413 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710267 4816 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710307 4816 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710320 4816 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710332 4816 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 11 12:02:51 crc kubenswrapper[4816]: I0311 12:02:51.710346 4816 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 11 12:02:52 crc kubenswrapper[4816]: I0311 12:02:52.137546 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 11 12:03:01 crc kubenswrapper[4816]: I0311 12:03:01.638044 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerID="8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1" exitCode=0
Mar 11 12:03:01 crc kubenswrapper[4816]: I0311 12:03:01.638173 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerDied","Data":"8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1"}
Mar 11 12:03:01 crc kubenswrapper[4816]: I0311 12:03:01.638902 4816 scope.go:117] "RemoveContainer" containerID="8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1"
Mar 11 12:03:02 crc kubenswrapper[4816]: I0311 12:03:02.646737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerStarted","Data":"a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb"}
Mar 11 12:03:02 crc kubenswrapper[4816]: I0311 12:03:02.647149 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4"
Mar 11 12:03:02 crc kubenswrapper[4816]: I0311 12:03:02.648406 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4"
Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.514957 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.515694 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.515764 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82"
Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.516548 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.516605 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2" gracePeriod=600
Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.684511 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2" exitCode=0
Mar 11 12:03:09 crc kubenswrapper[4816]: I0311 12:03:09.684592 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2"}
Mar 11 12:03:10 crc kubenswrapper[4816]: I0311 12:03:10.692466 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c"}
Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.977177 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p4bcz"]
Mar 11 12:03:54 crc kubenswrapper[4816]: E0311 12:03:54.977929 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.977941 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 12:03:54 crc kubenswrapper[4816]: E0311 12:03:54.977965 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5c6602-69d6-46be-a23b-fb4d6290a974" containerName="oc"
Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.977972 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5c6602-69d6-46be-a23b-fb4d6290a974" containerName="oc"
Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.978062 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5c6602-69d6-46be-a23b-fb4d6290a974" containerName="oc"
Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.978074 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 12:03:54 crc kubenswrapper[4816]: I0311 12:03:54.978469 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.019748 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p4bcz"]
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105547 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-registry-tls\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105596 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-trusted-ca\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105623 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-registry-certificates\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105733 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9jkk\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-kube-api-access-q9jkk\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105785 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105827 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3cb18b94-7487-4088-9435-6c312a8727c0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105853 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-bound-sa-token\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.105877 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3cb18b94-7487-4088-9435-6c312a8727c0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.133031 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207206 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3cb18b94-7487-4088-9435-6c312a8727c0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207266 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-bound-sa-token\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207289 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3cb18b94-7487-4088-9435-6c312a8727c0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207318 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-registry-tls\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207341 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-trusted-ca\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207366 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-registry-certificates\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9jkk\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-kube-api-access-q9jkk\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.207899 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3cb18b94-7487-4088-9435-6c312a8727c0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.209088 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-trusted-ca\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.209403 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3cb18b94-7487-4088-9435-6c312a8727c0-registry-certificates\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.214384 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3cb18b94-7487-4088-9435-6c312a8727c0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.216860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-registry-tls\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.225725 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9jkk\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-kube-api-access-q9jkk\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.231039 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb18b94-7487-4088-9435-6c312a8727c0-bound-sa-token\") pod \"image-registry-66df7c8f76-p4bcz\" (UID: \"3cb18b94-7487-4088-9435-6c312a8727c0\") " pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.305775 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.720793 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p4bcz"]
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.974915 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" event={"ID":"3cb18b94-7487-4088-9435-6c312a8727c0","Type":"ContainerStarted","Data":"23df1dfd8b0ca574380b93362401165bb7788016f363855e168a1a69cd2ff738"}
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.975053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" event={"ID":"3cb18b94-7487-4088-9435-6c312a8727c0","Type":"ContainerStarted","Data":"cb4411c5ac3bd87f28176dc09c2ef533cb18b2299541a493db052cf7bb9ccf20"}
Mar 11 12:03:55 crc kubenswrapper[4816]: I0311 12:03:55.977474 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz"
Mar 11 12:03:56 crc kubenswrapper[4816]: I0311 12:03:56.012450 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" podStartSLOduration=2.012435744 podStartE2EDuration="2.012435744s" podCreationTimestamp="2026-03-11 12:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:03:56.009751685 +0000 UTC m=+322.601015642" watchObservedRunningTime="2026-03-11 12:03:56.012435744 +0000 UTC m=+322.603699711"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.532785 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"]
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.534085 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9fv28" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" containerID="cri-o://362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" gracePeriod=30
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.548769 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"]
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.549129 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jwq6f" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" containerID="cri-o://33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" gracePeriod=30
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.569329 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"]
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.569530 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" containerID="cri-o://a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb" gracePeriod=30
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.576655 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"]
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.576877 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rlvrz" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="registry-server" containerID="cri-o://37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" gracePeriod=30
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.588793 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"]
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.591625 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtm2c" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="registry-server" containerID="cri-o://b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f" gracePeriod=30
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.604604 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m586v"]
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.605300 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.614234 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m586v"]
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.677901 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.677975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs29h\" (UniqueName: \"kubernetes.io/projected/e86ee6f4-c5ee-40dd-8e60-977add936dc1-kube-api-access-fs29h\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.677999 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.779587 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.780171 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs29h\" (UniqueName: \"kubernetes.io/projected/e86ee6f4-c5ee-40dd-8e60-977add936dc1-kube-api-access-fs29h\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.780956 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.782108 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.785385 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e86ee6f4-c5ee-40dd-8e60-977add936dc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.800422 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs29h\" (UniqueName: \"kubernetes.io/projected/e86ee6f4-c5ee-40dd-8e60-977add936dc1-kube-api-access-fs29h\") pod \"marketplace-operator-79b997595-m586v\" (UID: \"e86ee6f4-c5ee-40dd-8e60-977add936dc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.949551 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m586v"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.961914 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9fv28"
Mar 11 12:03:59 crc kubenswrapper[4816]: I0311 12:03:59.996127 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlvrz"
Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.007187 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwq6f"
Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008057 4816 generic.go:334] "Generic (PLEG): container finished" podID="e94af1b5-09ef-433f-91e6-7b352836273d" containerID="37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" exitCode=0
Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerDied","Data":"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17"}
Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008138 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlvrz" event={"ID":"e94af1b5-09ef-433f-91e6-7b352836273d","Type":"ContainerDied","Data":"a8cafecc50e94d07fe579d21307c54f31a39be731f47fedd9b733a84b5d89387"}
Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008156 4816 scope.go:117] "RemoveContainer" containerID="37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17"
Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.008294 4816 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlvrz" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.022089 4816 generic.go:334] "Generic (PLEG): container finished" podID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerID="33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.022404 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerDied","Data":"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.022436 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwq6f" event={"ID":"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e","Type":"ContainerDied","Data":"499f7962c1697f289517091d9831d7c624088927518036ee83a281ffd5b62905"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.022510 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwq6f" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.025096 4816 generic.go:334] "Generic (PLEG): container finished" podID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerID="362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.025329 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fv28" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.025328 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerDied","Data":"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.033051 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fv28" event={"ID":"8d6e662d-8633-4e55-baf3-50a2c4d179a1","Type":"ContainerDied","Data":"e00a61b1b339e0c135f2f8629c96ed94976ec15fddfa98352c7a50768117327d"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.041296 4816 scope.go:117] "RemoveContainer" containerID="e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.045013 4816 generic.go:334] "Generic (PLEG): container finished" podID="ce281163-d6c0-444b-ba55-b488dd77b853" containerID="b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.045111 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerDied","Data":"b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.047225 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerID="a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb" exitCode=0 Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.047262 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" 
event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerDied","Data":"a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb"} Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.053822 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.059229 4816 scope.go:117] "RemoveContainer" containerID="d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.079475 4816 scope.go:117] "RemoveContainer" containerID="37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.080337 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17\": container with ID starting with 37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17 not found: ID does not exist" containerID="37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.080378 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17"} err="failed to get container status \"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17\": rpc error: code = NotFound desc = could not find container \"37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17\": container with ID starting with 37976cadeeaaabfe64d7c991892ed58c96dbf0539d9df222d8bb3ac68e91cc17 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.080400 4816 scope.go:117] "RemoveContainer" containerID="e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 
12:04:00.080549 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.080816 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941\": container with ID starting with e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941 not found: ID does not exist" containerID="e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.080846 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941"} err="failed to get container status \"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941\": rpc error: code = NotFound desc = could not find container \"e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941\": container with ID starting with e4f153012f5a35062f99ba998ca453204f279e596452667abbcfbcf6da4a9941 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.080860 4816 scope.go:117] "RemoveContainer" containerID="d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.081226 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc\": container with ID starting with d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc not found: ID does not exist" containerID="d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.081283 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc"} err="failed to get container status \"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc\": rpc error: code = NotFound desc = could not find container \"d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc\": container with ID starting with d8fabca9b2997290f11fbf07232f8d58b3654ac767d5341fa694844063002fdc not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.081315 4816 scope.go:117] "RemoveContainer" containerID="33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086569 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") pod \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086647 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") pod \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086673 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") pod \"e94af1b5-09ef-433f-91e6-7b352836273d\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086710 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") pod \"e94af1b5-09ef-433f-91e6-7b352836273d\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086746 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") pod \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086790 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") pod \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086810 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") pod \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\" (UID: \"fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086827 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") pod \"e94af1b5-09ef-433f-91e6-7b352836273d\" (UID: \"e94af1b5-09ef-433f-91e6-7b352836273d\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.086852 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") pod \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\" (UID: \"8d6e662d-8633-4e55-baf3-50a2c4d179a1\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.089858 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities" (OuterVolumeSpecName: "utilities") pod "8d6e662d-8633-4e55-baf3-50a2c4d179a1" (UID: "8d6e662d-8633-4e55-baf3-50a2c4d179a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.091211 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities" (OuterVolumeSpecName: "utilities") pod "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" (UID: "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.093909 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities" (OuterVolumeSpecName: "utilities") pod "e94af1b5-09ef-433f-91e6-7b352836273d" (UID: "e94af1b5-09ef-433f-91e6-7b352836273d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.094822 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg" (OuterVolumeSpecName: "kube-api-access-fz9fg") pod "8d6e662d-8633-4e55-baf3-50a2c4d179a1" (UID: "8d6e662d-8633-4e55-baf3-50a2c4d179a1"). InnerVolumeSpecName "kube-api-access-fz9fg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.097296 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf" (OuterVolumeSpecName: "kube-api-access-xchpf") pod "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" (UID: "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e"). InnerVolumeSpecName "kube-api-access-xchpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.118319 4816 scope.go:117] "RemoveContainer" containerID="af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.140688 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq" (OuterVolumeSpecName: "kube-api-access-5lvwq") pod "e94af1b5-09ef-433f-91e6-7b352836273d" (UID: "e94af1b5-09ef-433f-91e6-7b352836273d"). InnerVolumeSpecName "kube-api-access-5lvwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.156833 4816 scope.go:117] "RemoveContainer" containerID="3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.164328 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e94af1b5-09ef-433f-91e6-7b352836273d" (UID: "e94af1b5-09ef-433f-91e6-7b352836273d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.177845 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d6e662d-8633-4e55-baf3-50a2c4d179a1" (UID: "8d6e662d-8633-4e55-baf3-50a2c4d179a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181342 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181619 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181636 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181653 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181661 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181672 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181680 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181692 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181702 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181711 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181718 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181728 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181735 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181746 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181753 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181761 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181769 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181817 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181826 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181837 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181845 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181856 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181865 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181877 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181885 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181893 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181902 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="extract-content" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.181912 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.181920 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="extract-utilities" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182033 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182046 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182061 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182074 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182088 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" containerName="registry-server" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.182567 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.184850 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.185118 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.185273 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.187183 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.187870 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") pod \"ce281163-d6c0-444b-ba55-b488dd77b853\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.187938 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") pod \"ce281163-d6c0-444b-ba55-b488dd77b853\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.187976 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") pod \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188008 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") pod \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188049 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") pod \"ce281163-d6c0-444b-ba55-b488dd77b853\" (UID: \"ce281163-d6c0-444b-ba55-b488dd77b853\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") pod \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\" (UID: \"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6\") " Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188373 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xchpf\" (UniqueName: \"kubernetes.io/projected/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-kube-api-access-xchpf\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188389 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvwq\" (UniqueName: \"kubernetes.io/projected/e94af1b5-09ef-433f-91e6-7b352836273d-kube-api-access-5lvwq\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188401 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188412 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188422 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188433 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e94af1b5-09ef-433f-91e6-7b352836273d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188443 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d6e662d-8633-4e55-baf3-50a2c4d179a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.188454 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz9fg\" (UniqueName: \"kubernetes.io/projected/8d6e662d-8633-4e55-baf3-50a2c4d179a1-kube-api-access-fz9fg\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.191093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" (UID: "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.191642 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz" (OuterVolumeSpecName: "kube-api-access-86nkz") pod "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" (UID: "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6"). InnerVolumeSpecName "kube-api-access-86nkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.192123 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" (UID: "1f8d6149-c5b0-4088-9db5-eeed2eef6ce6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.192229 4816 scope.go:117] "RemoveContainer" containerID="33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.192300 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities" (OuterVolumeSpecName: "utilities") pod "ce281163-d6c0-444b-ba55-b488dd77b853" (UID: "ce281163-d6c0-444b-ba55-b488dd77b853"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.199673 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" (UID: "fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.200174 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783\": container with ID starting with 33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783 not found: ID does not exist" containerID="33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200199 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9" (OuterVolumeSpecName: "kube-api-access-thth9") pod "ce281163-d6c0-444b-ba55-b488dd77b853" (UID: "ce281163-d6c0-444b-ba55-b488dd77b853"). InnerVolumeSpecName "kube-api-access-thth9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200263 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783"} err="failed to get container status \"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783\": rpc error: code = NotFound desc = could not find container \"33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783\": container with ID starting with 33fa1abaf83df4647d38f4486b6eeacba9e46e0cce2fe298d46d9eed8b730783 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200290 4816 scope.go:117] "RemoveContainer" containerID="af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.200876 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247\": container with ID starting with af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247 not found: ID does not exist" containerID="af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200910 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247"} err="failed to get container status \"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247\": rpc error: code = NotFound desc = could not find container \"af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247\": container with ID starting with af230f77d632f2ef7272588c7105b3e41503277008887f3bbb0bdc946fb02247 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.200939 4816 scope.go:117] "RemoveContainer" containerID="3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.201230 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd\": container with ID starting with 3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd not found: ID does not exist" containerID="3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.201295 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd"} err="failed to get container status \"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd\": rpc error: code = NotFound desc = could not find container \"3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd\": container with ID 
starting with 3f3e0d0db447a1ebe4e030b046c6446226abeed16eb18f4857b3f5d5fca2fdbd not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.201309 4816 scope.go:117] "RemoveContainer" containerID="362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.218995 4816 scope.go:117] "RemoveContainer" containerID="4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.238529 4816 scope.go:117] "RemoveContainer" containerID="1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.255170 4816 scope.go:117] "RemoveContainer" containerID="362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.255772 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98\": container with ID starting with 362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98 not found: ID does not exist" containerID="362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.255801 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98"} err="failed to get container status \"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98\": rpc error: code = NotFound desc = could not find container \"362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98\": container with ID starting with 362163a58c3530fa9e11ca63e8340195a0a89db0eb885f3d4d89779e7907bf98 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.255820 4816 scope.go:117] "RemoveContainer" 
containerID="4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.256074 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17\": container with ID starting with 4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17 not found: ID does not exist" containerID="4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.256348 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17"} err="failed to get container status \"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17\": rpc error: code = NotFound desc = could not find container \"4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17\": container with ID starting with 4105f5db45375627892a708c76ac931ccf2827e2e9b64e788c33b62ca6ee5c17 not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.256362 4816 scope.go:117] "RemoveContainer" containerID="1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb" Mar 11 12:04:00 crc kubenswrapper[4816]: E0311 12:04:00.256610 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb\": container with ID starting with 1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb not found: ID does not exist" containerID="1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.256631 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb"} err="failed to get container status \"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb\": rpc error: code = NotFound desc = could not find container \"1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb\": container with ID starting with 1fff89a6c486a4c24e56579ae8348f4ab713b43ed67023000094d7aea36a80cb not found: ID does not exist" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.256643 4816 scope.go:117] "RemoveContainer" containerID="8656da7afe12612a591590b5842a75afd40668d9dd72d7b01fcb55c35787a0e1" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290162 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") pod \"auto-csr-approver-29553844-df99q\" (UID: \"16125795-8697-470d-bc37-1ab8f6e31af1\") " pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290235 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290258 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86nkz\" (UniqueName: \"kubernetes.io/projected/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-kube-api-access-86nkz\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290267 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thth9\" (UniqueName: \"kubernetes.io/projected/ce281163-d6c0-444b-ba55-b488dd77b853-kube-api-access-thth9\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290279 4816 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290287 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.290296 4816 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.336306 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.336536 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlvrz"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.348602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce281163-d6c0-444b-ba55-b488dd77b853" (UID: "ce281163-d6c0-444b-ba55-b488dd77b853"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.351165 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.354695 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jwq6f"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.389927 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.391829 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") pod \"auto-csr-approver-29553844-df99q\" (UID: \"16125795-8697-470d-bc37-1ab8f6e31af1\") " pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.391896 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce281163-d6c0-444b-ba55-b488dd77b853-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.391965 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9fv28"] Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.409999 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") pod \"auto-csr-approver-29553844-df99q\" (UID: \"16125795-8697-470d-bc37-1ab8f6e31af1\") " pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.479397 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-m586v"] Mar 11 12:04:00 crc kubenswrapper[4816]: W0311 12:04:00.483172 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode86ee6f4_c5ee_40dd_8e60_977add936dc1.slice/crio-ba0e1127a9976371e0a2243c7d72c560e01aaec629769241e8e2aab44dfcf7ee WatchSource:0}: Error finding container ba0e1127a9976371e0a2243c7d72c560e01aaec629769241e8e2aab44dfcf7ee: Status 404 returned error can't find the container with id ba0e1127a9976371e0a2243c7d72c560e01aaec629769241e8e2aab44dfcf7ee Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.502443 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:00 crc kubenswrapper[4816]: I0311 12:04:00.688796 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:04:00 crc kubenswrapper[4816]: W0311 12:04:00.695456 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16125795_8697_470d_bc37_1ab8f6e31af1.slice/crio-ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9 WatchSource:0}: Error finding container ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9: Status 404 returned error can't find the container with id ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9 Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.056913 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtm2c" event={"ID":"ce281163-d6c0-444b-ba55-b488dd77b853","Type":"ContainerDied","Data":"7e759bdc79b60bb3704deb1a705f04eeaaf47ec7245e831ed02fd21393a8ffe0"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.056960 4816 scope.go:117] "RemoveContainer" 
containerID="b4ab0057fec3813a8eba57d93db34dca15d692fb5d18b567388c379b9637e53f" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.056978 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtm2c" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.059629 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.059930 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8gcm4" event={"ID":"1f8d6149-c5b0-4088-9db5-eeed2eef6ce6","Type":"ContainerDied","Data":"18da590f53c2a68db8ccc3639b30699431b029db82a4def3280157c1b87bba73"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.062121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553844-df99q" event={"ID":"16125795-8697-470d-bc37-1ab8f6e31af1","Type":"ContainerStarted","Data":"ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.065373 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" event={"ID":"e86ee6f4-c5ee-40dd-8e60-977add936dc1","Type":"ContainerStarted","Data":"027caf2d729990c1d9676988eefc8343957b31187ff5d9808f12331ab5090d22"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.065399 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" event={"ID":"e86ee6f4-c5ee-40dd-8e60-977add936dc1","Type":"ContainerStarted","Data":"ba0e1127a9976371e0a2243c7d72c560e01aaec629769241e8e2aab44dfcf7ee"} Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.065592 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.071265 4816 scope.go:117] "RemoveContainer" containerID="a7c62a5bd8897be83a617ee46b4e99b960de1d3c06824d144ebcc6c092953124" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.071414 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.096348 4816 scope.go:117] "RemoveContainer" containerID="3ab1f4b901f51b92d05dc18c4be8f53411d27fe11dfae52c40ff6b519e7e0cea" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.108270 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m586v" podStartSLOduration=2.108228275 podStartE2EDuration="2.108228275s" podCreationTimestamp="2026-03-11 12:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:04:01.098578984 +0000 UTC m=+327.689842951" watchObservedRunningTime="2026-03-11 12:04:01.108228275 +0000 UTC m=+327.699492242" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.121410 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.134236 4816 scope.go:117] "RemoveContainer" containerID="a866a70345c8481e25e2f58460d24a8a5c95dd9260f5acf809685fe8295ea5eb" Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.151144 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtm2c"] Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.157820 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"] Mar 11 12:04:01 crc kubenswrapper[4816]: I0311 12:04:01.161463 4816 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8gcm4"] Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.138286 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" path="/var/lib/kubelet/pods/1f8d6149-c5b0-4088-9db5-eeed2eef6ce6/volumes" Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.139051 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6e662d-8633-4e55-baf3-50a2c4d179a1" path="/var/lib/kubelet/pods/8d6e662d-8633-4e55-baf3-50a2c4d179a1/volumes" Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.139658 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce281163-d6c0-444b-ba55-b488dd77b853" path="/var/lib/kubelet/pods/ce281163-d6c0-444b-ba55-b488dd77b853/volumes" Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.140786 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94af1b5-09ef-433f-91e6-7b352836273d" path="/var/lib/kubelet/pods/e94af1b5-09ef-433f-91e6-7b352836273d/volumes" Mar 11 12:04:02 crc kubenswrapper[4816]: I0311 12:04:02.141412 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e" path="/var/lib/kubelet/pods/fd6e7ddf-4a19-45d8-ac3a-4960e3b26f4e/volumes" Mar 11 12:04:03 crc kubenswrapper[4816]: I0311 12:04:03.081908 4816 generic.go:334] "Generic (PLEG): container finished" podID="16125795-8697-470d-bc37-1ab8f6e31af1" containerID="3d89e5845eb14d7e6c90a432b751164398e20a4fe55d6026ce8f8ec622962660" exitCode=0 Mar 11 12:04:03 crc kubenswrapper[4816]: I0311 12:04:03.081990 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553844-df99q" event={"ID":"16125795-8697-470d-bc37-1ab8f6e31af1","Type":"ContainerDied","Data":"3d89e5845eb14d7e6c90a432b751164398e20a4fe55d6026ce8f8ec622962660"} Mar 11 12:04:04 crc kubenswrapper[4816]: I0311 
12:04:04.303311 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:04 crc kubenswrapper[4816]: I0311 12:04:04.451817 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") pod \"16125795-8697-470d-bc37-1ab8f6e31af1\" (UID: \"16125795-8697-470d-bc37-1ab8f6e31af1\") " Mar 11 12:04:04 crc kubenswrapper[4816]: I0311 12:04:04.457469 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6" (OuterVolumeSpecName: "kube-api-access-67fg6") pod "16125795-8697-470d-bc37-1ab8f6e31af1" (UID: "16125795-8697-470d-bc37-1ab8f6e31af1"). InnerVolumeSpecName "kube-api-access-67fg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:04 crc kubenswrapper[4816]: I0311 12:04:04.553428 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67fg6\" (UniqueName: \"kubernetes.io/projected/16125795-8697-470d-bc37-1ab8f6e31af1-kube-api-access-67fg6\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:05 crc kubenswrapper[4816]: I0311 12:04:05.092693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553844-df99q" event={"ID":"16125795-8697-470d-bc37-1ab8f6e31af1","Type":"ContainerDied","Data":"ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9"} Mar 11 12:04:05 crc kubenswrapper[4816]: I0311 12:04:05.092748 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed45a5fe576914ea5f97f6cf2f2568eb5a9c46841daef8cba1a2cce70da3e5e9" Mar 11 12:04:05 crc kubenswrapper[4816]: I0311 12:04:05.093074 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553844-df99q" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.170361 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 12:04:12 crc kubenswrapper[4816]: E0311 12:04:12.171205 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16125795-8697-470d-bc37-1ab8f6e31af1" containerName="oc" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.171221 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="16125795-8697-470d-bc37-1ab8f6e31af1" containerName="oc" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.171354 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8d6149-c5b0-4088-9db5-eeed2eef6ce6" containerName="marketplace-operator" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.171370 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="16125795-8697-470d-bc37-1ab8f6e31af1" containerName="oc" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.172368 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.174793 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.184105 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.256994 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.257054 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.257098 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.359815 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") pod \"community-operators-qb5pd\" (UID: 
\"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.359881 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.359934 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.361012 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.361405 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.375840 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.381329 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.385644 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.387454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") pod \"community-operators-qb5pd\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.393241 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.460998 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.461073 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.461170 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") pod \"certified-operators-wlx2d\" (UID: 
\"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.503238 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562039 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562510 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562576 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.562918 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.588178 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") pod \"certified-operators-wlx2d\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.718391 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.898852 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 12:04:12 crc kubenswrapper[4816]: W0311 12:04:12.900995 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod963d27c0_f203_4997_aa60_ac73d2a54cc0.slice/crio-5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22 WatchSource:0}: Error finding container 5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22: Status 404 returned error can't find the container with id 5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22 Mar 11 12:04:12 crc kubenswrapper[4816]: I0311 12:04:12.901315 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:04:12 crc kubenswrapper[4816]: W0311 12:04:12.907815 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd456b988_0480_49fc_9667_03c56b871abe.slice/crio-e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258 WatchSource:0}: Error finding container e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258: Status 404 returned error can't find the container with id e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258 Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.136971 4816 generic.go:334] "Generic (PLEG): container finished" podID="d456b988-0480-49fc-9667-03c56b871abe" containerID="a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf" exitCode=0 Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.137045 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerDied","Data":"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf"} Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.137122 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerStarted","Data":"e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258"} Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.139725 4816 generic.go:334] "Generic (PLEG): container finished" podID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerID="6e58f19a27ae3010beb47e8be328d7c7ee7c8f14b5f34d2213706b6f25097290" exitCode=0 Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.139773 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerDied","Data":"6e58f19a27ae3010beb47e8be328d7c7ee7c8f14b5f34d2213706b6f25097290"} Mar 11 12:04:13 crc kubenswrapper[4816]: I0311 12:04:13.139803 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerStarted","Data":"5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22"} Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.147431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerStarted","Data":"8c235b1052133359e398ac00a2eee490f7a085338a2901f71eac6e872bda6cbf"} Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.151198 4816 generic.go:334] "Generic (PLEG): container finished" podID="d456b988-0480-49fc-9667-03c56b871abe" containerID="6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60" exitCode=0 Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.151319 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerDied","Data":"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60"} Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.772126 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9jq"] Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.775455 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.779350 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.785308 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9jq"] Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.898633 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-utilities\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.898725 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkbz\" (UniqueName: \"kubernetes.io/projected/991327ed-0ad5-4161-a218-598e50bbafe9-kube-api-access-6hkbz\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.898870 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-catalog-content\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.970527 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4czr8"] Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.972077 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.976554 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 12:04:14 crc kubenswrapper[4816]: I0311 12:04:14.980007 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4czr8"] Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.000637 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-catalog-content\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.000743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-utilities\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.000789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkbz\" (UniqueName: \"kubernetes.io/projected/991327ed-0ad5-4161-a218-598e50bbafe9-kube-api-access-6hkbz\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.001287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-catalog-content\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " 
pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.001410 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/991327ed-0ad5-4161-a218-598e50bbafe9-utilities\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.036806 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkbz\" (UniqueName: \"kubernetes.io/projected/991327ed-0ad5-4161-a218-598e50bbafe9-kube-api-access-6hkbz\") pod \"redhat-marketplace-9f9jq\" (UID: \"991327ed-0ad5-4161-a218-598e50bbafe9\") " pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.094659 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.102023 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fpm7\" (UniqueName: \"kubernetes.io/projected/08bf2596-9393-42d3-9b76-461be3ee0c22-kube-api-access-7fpm7\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.102115 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-utilities\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.102134 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-catalog-content\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.165541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerStarted","Data":"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f"} Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.168825 4816 generic.go:334] "Generic (PLEG): container finished" podID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerID="8c235b1052133359e398ac00a2eee490f7a085338a2901f71eac6e872bda6cbf" exitCode=0 Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.168881 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerDied","Data":"8c235b1052133359e398ac00a2eee490f7a085338a2901f71eac6e872bda6cbf"} Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.185178 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wlx2d" podStartSLOduration=1.6543980729999999 podStartE2EDuration="3.185161604s" podCreationTimestamp="2026-03-11 12:04:12 +0000 UTC" firstStartedPulling="2026-03-11 12:04:13.138604745 +0000 UTC m=+339.729868712" lastFinishedPulling="2026-03-11 12:04:14.669368276 +0000 UTC m=+341.260632243" observedRunningTime="2026-03-11 12:04:15.184749052 +0000 UTC m=+341.776013019" watchObservedRunningTime="2026-03-11 12:04:15.185161604 +0000 UTC m=+341.776425571" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.207355 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-utilities\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.207392 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-catalog-content\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.207430 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fpm7\" (UniqueName: \"kubernetes.io/projected/08bf2596-9393-42d3-9b76-461be3ee0c22-kube-api-access-7fpm7\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.208206 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-utilities\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.208434 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08bf2596-9393-42d3-9b76-461be3ee0c22-catalog-content\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.246434 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fpm7\" (UniqueName: 
\"kubernetes.io/projected/08bf2596-9393-42d3-9b76-461be3ee0c22-kube-api-access-7fpm7\") pod \"redhat-operators-4czr8\" (UID: \"08bf2596-9393-42d3-9b76-461be3ee0c22\") " pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.315820 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p4bcz" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.347303 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.372692 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.563957 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4czr8"] Mar 11 12:04:15 crc kubenswrapper[4816]: I0311 12:04:15.596507 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9jq"] Mar 11 12:04:15 crc kubenswrapper[4816]: W0311 12:04:15.602775 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991327ed_0ad5_4161_a218_598e50bbafe9.slice/crio-d0036afadbd6ee40c4d716a9f9bbe951341dd42bc2d4900d369407d3de2aaca0 WatchSource:0}: Error finding container d0036afadbd6ee40c4d716a9f9bbe951341dd42bc2d4900d369407d3de2aaca0: Status 404 returned error can't find the container with id d0036afadbd6ee40c4d716a9f9bbe951341dd42bc2d4900d369407d3de2aaca0 Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.179353 4816 generic.go:334] "Generic (PLEG): container finished" podID="991327ed-0ad5-4161-a218-598e50bbafe9" containerID="276b5a20d3bef5cecb98f046cf1ab8c761311214dde470e9bf10f1198b21e2c2" exitCode=0 Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.179637 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9jq" event={"ID":"991327ed-0ad5-4161-a218-598e50bbafe9","Type":"ContainerDied","Data":"276b5a20d3bef5cecb98f046cf1ab8c761311214dde470e9bf10f1198b21e2c2"} Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.179804 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9jq" event={"ID":"991327ed-0ad5-4161-a218-598e50bbafe9","Type":"ContainerStarted","Data":"d0036afadbd6ee40c4d716a9f9bbe951341dd42bc2d4900d369407d3de2aaca0"} Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.184024 4816 generic.go:334] "Generic (PLEG): container finished" podID="08bf2596-9393-42d3-9b76-461be3ee0c22" containerID="f6e708c07beedc927d88fe860160fe134356afe7b3a445a39a8afeb3d7fe107a" exitCode=0 Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.184244 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czr8" event={"ID":"08bf2596-9393-42d3-9b76-461be3ee0c22","Type":"ContainerDied","Data":"f6e708c07beedc927d88fe860160fe134356afe7b3a445a39a8afeb3d7fe107a"} Mar 11 12:04:16 crc kubenswrapper[4816]: I0311 12:04:16.184326 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czr8" event={"ID":"08bf2596-9393-42d3-9b76-461be3ee0c22","Type":"ContainerStarted","Data":"0bfb7b65242e9132f4eeb09297c51306c673c57d0632f40983551ce70feb2ca5"} Mar 11 12:04:17 crc kubenswrapper[4816]: I0311 12:04:17.191071 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerStarted","Data":"df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336"} Mar 11 12:04:17 crc kubenswrapper[4816]: I0311 12:04:17.193101 4816 generic.go:334] "Generic (PLEG): container finished" podID="991327ed-0ad5-4161-a218-598e50bbafe9" 
containerID="3a5a38f455f732668a196f091e014f4ac2c18dd8d3f055e916189b87f4ab5984" exitCode=0 Mar 11 12:04:17 crc kubenswrapper[4816]: I0311 12:04:17.193153 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9jq" event={"ID":"991327ed-0ad5-4161-a218-598e50bbafe9","Type":"ContainerDied","Data":"3a5a38f455f732668a196f091e014f4ac2c18dd8d3f055e916189b87f4ab5984"} Mar 11 12:04:17 crc kubenswrapper[4816]: I0311 12:04:17.209268 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qb5pd" podStartSLOduration=2.228415326 podStartE2EDuration="5.209235396s" podCreationTimestamp="2026-03-11 12:04:12 +0000 UTC" firstStartedPulling="2026-03-11 12:04:13.141173712 +0000 UTC m=+339.732437679" lastFinishedPulling="2026-03-11 12:04:16.121993782 +0000 UTC m=+342.713257749" observedRunningTime="2026-03-11 12:04:17.206313797 +0000 UTC m=+343.797577764" watchObservedRunningTime="2026-03-11 12:04:17.209235396 +0000 UTC m=+343.800499363" Mar 11 12:04:18 crc kubenswrapper[4816]: I0311 12:04:18.200475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9jq" event={"ID":"991327ed-0ad5-4161-a218-598e50bbafe9","Type":"ContainerStarted","Data":"1dcfedce7d5dd95fbbcd4585d201214260e6231ba9c368710f19185b969935ea"} Mar 11 12:04:18 crc kubenswrapper[4816]: I0311 12:04:18.202069 4816 generic.go:334] "Generic (PLEG): container finished" podID="08bf2596-9393-42d3-9b76-461be3ee0c22" containerID="779fac97e6262ed086b95c9507f877a519a9ebc4041e2dc8f6025e304e1b6964" exitCode=0 Mar 11 12:04:18 crc kubenswrapper[4816]: I0311 12:04:18.202125 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czr8" event={"ID":"08bf2596-9393-42d3-9b76-461be3ee0c22","Type":"ContainerDied","Data":"779fac97e6262ed086b95c9507f877a519a9ebc4041e2dc8f6025e304e1b6964"} Mar 11 12:04:18 crc kubenswrapper[4816]: I0311 
12:04:18.217492 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9f9jq" podStartSLOduration=2.798062164 podStartE2EDuration="4.217473297s" podCreationTimestamp="2026-03-11 12:04:14 +0000 UTC" firstStartedPulling="2026-03-11 12:04:16.181394874 +0000 UTC m=+342.772658851" lastFinishedPulling="2026-03-11 12:04:17.600806017 +0000 UTC m=+344.192069984" observedRunningTime="2026-03-11 12:04:18.216146767 +0000 UTC m=+344.807410734" watchObservedRunningTime="2026-03-11 12:04:18.217473297 +0000 UTC m=+344.808737264" Mar 11 12:04:19 crc kubenswrapper[4816]: I0311 12:04:19.209361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4czr8" event={"ID":"08bf2596-9393-42d3-9b76-461be3ee0c22","Type":"ContainerStarted","Data":"d99aa505afb58c2dbaf5d0203d020f8764555084a9ed6334aa20ae5cb8b3b88c"} Mar 11 12:04:19 crc kubenswrapper[4816]: I0311 12:04:19.228191 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4czr8" podStartSLOduration=2.7697303399999997 podStartE2EDuration="5.228177503s" podCreationTimestamp="2026-03-11 12:04:14 +0000 UTC" firstStartedPulling="2026-03-11 12:04:16.18624547 +0000 UTC m=+342.777509477" lastFinishedPulling="2026-03-11 12:04:18.644692683 +0000 UTC m=+345.235956640" observedRunningTime="2026-03-11 12:04:19.227014018 +0000 UTC m=+345.818277985" watchObservedRunningTime="2026-03-11 12:04:19.228177503 +0000 UTC m=+345.819441470" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.503636 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.505407 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.565696 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.718636 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.718694 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:22 crc kubenswrapper[4816]: I0311 12:04:22.759321 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:23 crc kubenswrapper[4816]: I0311 12:04:23.273222 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:04:23 crc kubenswrapper[4816]: I0311 12:04:23.275261 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.095438 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.095710 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.141729 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.286235 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9f9jq" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.348257 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:25 crc kubenswrapper[4816]: I0311 12:04:25.348344 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:26 crc kubenswrapper[4816]: I0311 12:04:26.392641 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4czr8" podUID="08bf2596-9393-42d3-9b76-461be3ee0c22" containerName="registry-server" probeResult="failure" output=< Mar 11 12:04:26 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:04:26 crc kubenswrapper[4816]: > Mar 11 12:04:35 crc kubenswrapper[4816]: I0311 12:04:35.382506 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:35 crc kubenswrapper[4816]: I0311 12:04:35.432483 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4czr8" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.408853 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" podUID="9a7e3709-d407-4679-add6-375a835421be" containerName="registry" containerID="cri-o://29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" gracePeriod=30 Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.735593 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866076 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866101 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866286 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866344 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866386 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.866446 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.867088 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") pod \"9a7e3709-d407-4679-add6-375a835421be\" (UID: \"9a7e3709-d407-4679-add6-375a835421be\") " Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.868112 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.868398 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.873571 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.874137 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd" (OuterVolumeSpecName: "kube-api-access-6qwwd") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "kube-api-access-6qwwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.875798 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.877331 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.877660 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.882283 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9a7e3709-d407-4679-add6-375a835421be" (UID: "9a7e3709-d407-4679-add6-375a835421be"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968844 4816 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968872 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qwwd\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-kube-api-access-6qwwd\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968882 4816 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968893 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9a7e3709-d407-4679-add6-375a835421be-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968904 4816 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a7e3709-d407-4679-add6-375a835421be-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968915 4816 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a7e3709-d407-4679-add6-375a835421be-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:40 crc kubenswrapper[4816]: I0311 12:04:40.968923 4816 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a7e3709-d407-4679-add6-375a835421be-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.338920 4816 generic.go:334] "Generic (PLEG): container finished" podID="9a7e3709-d407-4679-add6-375a835421be" containerID="29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" exitCode=0 Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.338964 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.338976 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" event={"ID":"9a7e3709-d407-4679-add6-375a835421be","Type":"ContainerDied","Data":"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf"} Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.339027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p426k" event={"ID":"9a7e3709-d407-4679-add6-375a835421be","Type":"ContainerDied","Data":"ec23157cec86a7144fad1cf7ce6f1de12230714b1e857a2199a9972f099db0a1"} Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.339049 4816 scope.go:117] "RemoveContainer" containerID="29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.355711 4816 scope.go:117] "RemoveContainer" containerID="29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" Mar 11 12:04:41 crc kubenswrapper[4816]: E0311 12:04:41.358096 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf\": container with ID starting with 29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf not found: ID does not exist" containerID="29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.358148 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf"} err="failed to get container status \"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf\": rpc error: code = NotFound desc = could not find container 
\"29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf\": container with ID starting with 29a5575a4698992467da37317f3822ef73493ffefd321500908342f4c01a8fdf not found: ID does not exist" Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.364427 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:04:41 crc kubenswrapper[4816]: I0311 12:04:41.367711 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p426k"] Mar 11 12:04:42 crc kubenswrapper[4816]: I0311 12:04:42.137214 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7e3709-d407-4679-add6-375a835421be" path="/var/lib/kubelet/pods/9a7e3709-d407-4679-add6-375a835421be/volumes" Mar 11 12:05:39 crc kubenswrapper[4816]: I0311 12:05:39.514838 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:05:39 crc kubenswrapper[4816]: I0311 12:05:39.515424 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.146540 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:06:00 crc kubenswrapper[4816]: E0311 12:06:00.147430 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7e3709-d407-4679-add6-375a835421be" containerName="registry" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.147454 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7e3709-d407-4679-add6-375a835421be" containerName="registry" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.147760 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7e3709-d407-4679-add6-375a835421be" containerName="registry" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.148502 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.149912 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.151623 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.151668 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.152014 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.214768 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") pod \"auto-csr-approver-29553846-v5vlq\" (UID: \"8860b8d2-719a-4930-9df3-d0bc14d8de19\") " pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.316170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") pod \"auto-csr-approver-29553846-v5vlq\" (UID: 
\"8860b8d2-719a-4930-9df3-d0bc14d8de19\") " pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.335817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") pod \"auto-csr-approver-29553846-v5vlq\" (UID: \"8860b8d2-719a-4930-9df3-d0bc14d8de19\") " pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.468538 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.638937 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:06:00 crc kubenswrapper[4816]: I0311 12:06:00.647108 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:06:01 crc kubenswrapper[4816]: I0311 12:06:01.011111 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" event={"ID":"8860b8d2-719a-4930-9df3-d0bc14d8de19","Type":"ContainerStarted","Data":"09371881040e37ad815ab74ee53fb47977ad2f8c78f64b5e3d2d140a71ec6726"} Mar 11 12:06:02 crc kubenswrapper[4816]: I0311 12:06:02.017138 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" event={"ID":"8860b8d2-719a-4930-9df3-d0bc14d8de19","Type":"ContainerStarted","Data":"587884e89c7feed672cb66139bc979bafcbc72560d3687797e97ca922f238ebb"} Mar 11 12:06:02 crc kubenswrapper[4816]: I0311 12:06:02.035654 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" podStartSLOduration=1.070459283 podStartE2EDuration="2.035622694s" podCreationTimestamp="2026-03-11 12:06:00 +0000 
UTC" firstStartedPulling="2026-03-11 12:06:00.646888137 +0000 UTC m=+447.238152094" lastFinishedPulling="2026-03-11 12:06:01.612051508 +0000 UTC m=+448.203315505" observedRunningTime="2026-03-11 12:06:02.030331885 +0000 UTC m=+448.621595852" watchObservedRunningTime="2026-03-11 12:06:02.035622694 +0000 UTC m=+448.626886701" Mar 11 12:06:03 crc kubenswrapper[4816]: I0311 12:06:03.027765 4816 generic.go:334] "Generic (PLEG): container finished" podID="8860b8d2-719a-4930-9df3-d0bc14d8de19" containerID="587884e89c7feed672cb66139bc979bafcbc72560d3687797e97ca922f238ebb" exitCode=0 Mar 11 12:06:03 crc kubenswrapper[4816]: I0311 12:06:03.029089 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" event={"ID":"8860b8d2-719a-4930-9df3-d0bc14d8de19","Type":"ContainerDied","Data":"587884e89c7feed672cb66139bc979bafcbc72560d3687797e97ca922f238ebb"} Mar 11 12:06:04 crc kubenswrapper[4816]: I0311 12:06:04.292811 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:04 crc kubenswrapper[4816]: I0311 12:06:04.371690 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") pod \"8860b8d2-719a-4930-9df3-d0bc14d8de19\" (UID: \"8860b8d2-719a-4930-9df3-d0bc14d8de19\") " Mar 11 12:06:04 crc kubenswrapper[4816]: I0311 12:06:04.383478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh" (OuterVolumeSpecName: "kube-api-access-dj5vh") pod "8860b8d2-719a-4930-9df3-d0bc14d8de19" (UID: "8860b8d2-719a-4930-9df3-d0bc14d8de19"). InnerVolumeSpecName "kube-api-access-dj5vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:06:04 crc kubenswrapper[4816]: I0311 12:06:04.474569 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj5vh\" (UniqueName: \"kubernetes.io/projected/8860b8d2-719a-4930-9df3-d0bc14d8de19-kube-api-access-dj5vh\") on node \"crc\" DevicePath \"\"" Mar 11 12:06:05 crc kubenswrapper[4816]: I0311 12:06:05.048935 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" event={"ID":"8860b8d2-719a-4930-9df3-d0bc14d8de19","Type":"ContainerDied","Data":"09371881040e37ad815ab74ee53fb47977ad2f8c78f64b5e3d2d140a71ec6726"} Mar 11 12:06:05 crc kubenswrapper[4816]: I0311 12:06:05.048986 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09371881040e37ad815ab74ee53fb47977ad2f8c78f64b5e3d2d140a71ec6726" Mar 11 12:06:05 crc kubenswrapper[4816]: I0311 12:06:05.049045 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553846-v5vlq" Mar 11 12:06:09 crc kubenswrapper[4816]: I0311 12:06:09.515308 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:06:09 crc kubenswrapper[4816]: I0311 12:06:09.515647 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.515203 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.515993 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.516074 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.517100 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.517235 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c" gracePeriod=600 Mar 11 12:06:39 crc kubenswrapper[4816]: E0311 12:06:39.633077 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fdff21c_644f_4443_a268_f98c91ea120a.slice/crio-a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c.scope\": RecentStats: unable 
to find data in memory cache]" Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.687807 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c" exitCode=0 Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.687872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c"} Mar 11 12:06:39 crc kubenswrapper[4816]: I0311 12:06:39.688050 4816 scope.go:117] "RemoveContainer" containerID="fcc062c271cd12993a2f94302ad7910d23ab33f9e9c36dd18bc3d6cf66582bc2" Mar 11 12:06:40 crc kubenswrapper[4816]: I0311 12:06:40.694155 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca"} Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.137580 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:08:00 crc kubenswrapper[4816]: E0311 12:08:00.138366 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8860b8d2-719a-4930-9df3-d0bc14d8de19" containerName="oc" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.138381 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8860b8d2-719a-4930-9df3-d0bc14d8de19" containerName="oc" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.138517 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8860b8d2-719a-4930-9df3-d0bc14d8de19" containerName="oc" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.138953 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.140969 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.141943 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.142300 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.142868 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.247787 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") pod \"auto-csr-approver-29553848-blbgg\" (UID: \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\") " pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.349402 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") pod \"auto-csr-approver-29553848-blbgg\" (UID: \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\") " pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.367041 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") pod \"auto-csr-approver-29553848-blbgg\" (UID: \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\") " 
pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.457478 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:00 crc kubenswrapper[4816]: I0311 12:08:00.627791 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:08:01 crc kubenswrapper[4816]: I0311 12:08:01.189175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553848-blbgg" event={"ID":"cf7a354c-a3ec-44fe-8e27-028abd12d7d9","Type":"ContainerStarted","Data":"464cd99aebb8d1992d92bdb9f36912fa5157a5dd9a45577e3df7d1d25b868228"} Mar 11 12:08:02 crc kubenswrapper[4816]: I0311 12:08:02.210012 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553848-blbgg" event={"ID":"cf7a354c-a3ec-44fe-8e27-028abd12d7d9","Type":"ContainerStarted","Data":"ce753e08f0c29759ce4abeca1c2ba4ffc8217be9eee018a375b073d4682d5231"} Mar 11 12:08:02 crc kubenswrapper[4816]: I0311 12:08:02.227056 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553848-blbgg" podStartSLOduration=0.966789945 podStartE2EDuration="2.227037369s" podCreationTimestamp="2026-03-11 12:08:00 +0000 UTC" firstStartedPulling="2026-03-11 12:08:00.63686043 +0000 UTC m=+567.228124397" lastFinishedPulling="2026-03-11 12:08:01.897107854 +0000 UTC m=+568.488371821" observedRunningTime="2026-03-11 12:08:02.223103277 +0000 UTC m=+568.814367244" watchObservedRunningTime="2026-03-11 12:08:02.227037369 +0000 UTC m=+568.818301336" Mar 11 12:08:03 crc kubenswrapper[4816]: I0311 12:08:03.216426 4816 generic.go:334] "Generic (PLEG): container finished" podID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" containerID="ce753e08f0c29759ce4abeca1c2ba4ffc8217be9eee018a375b073d4682d5231" exitCode=0 Mar 11 12:08:03 crc 
kubenswrapper[4816]: I0311 12:08:03.216477 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553848-blbgg" event={"ID":"cf7a354c-a3ec-44fe-8e27-028abd12d7d9","Type":"ContainerDied","Data":"ce753e08f0c29759ce4abeca1c2ba4ffc8217be9eee018a375b073d4682d5231"} Mar 11 12:08:04 crc kubenswrapper[4816]: I0311 12:08:04.516607 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:04 crc kubenswrapper[4816]: I0311 12:08:04.708135 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") pod \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\" (UID: \"cf7a354c-a3ec-44fe-8e27-028abd12d7d9\") " Mar 11 12:08:04 crc kubenswrapper[4816]: I0311 12:08:04.717799 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx" (OuterVolumeSpecName: "kube-api-access-5njsx") pod "cf7a354c-a3ec-44fe-8e27-028abd12d7d9" (UID: "cf7a354c-a3ec-44fe-8e27-028abd12d7d9"). InnerVolumeSpecName "kube-api-access-5njsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:08:04 crc kubenswrapper[4816]: I0311 12:08:04.809815 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5njsx\" (UniqueName: \"kubernetes.io/projected/cf7a354c-a3ec-44fe-8e27-028abd12d7d9-kube-api-access-5njsx\") on node \"crc\" DevicePath \"\"" Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.229053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553848-blbgg" event={"ID":"cf7a354c-a3ec-44fe-8e27-028abd12d7d9","Type":"ContainerDied","Data":"464cd99aebb8d1992d92bdb9f36912fa5157a5dd9a45577e3df7d1d25b868228"} Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.229087 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="464cd99aebb8d1992d92bdb9f36912fa5157a5dd9a45577e3df7d1d25b868228" Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.229091 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553848-blbgg" Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.291732 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:08:05 crc kubenswrapper[4816]: I0311 12:08:05.296421 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553842-6xkxh"] Mar 11 12:08:06 crc kubenswrapper[4816]: I0311 12:08:06.137284 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba5c6602-69d6-46be-a23b-fb4d6290a974" path="/var/lib/kubelet/pods/ba5c6602-69d6-46be-a23b-fb4d6290a974/volumes" Mar 11 12:08:39 crc kubenswrapper[4816]: I0311 12:08:39.516027 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 11 12:08:39 crc kubenswrapper[4816]: I0311 12:08:39.518595 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:09:09 crc kubenswrapper[4816]: I0311 12:09:09.515239 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:09:09 crc kubenswrapper[4816]: I0311 12:09:09.516514 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:09:34 crc kubenswrapper[4816]: I0311 12:09:34.426515 4816 scope.go:117] "RemoveContainer" containerID="2cfac82b0530dfec9409f269e3ee40d7a556b84403bec8f94f82329b0208a810" Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.515445 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.515908 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.515976 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.517884 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.517999 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca" gracePeriod=600 Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.821621 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca" exitCode=0 Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.821707 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca"} Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.822016 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3"} Mar 11 12:09:39 crc kubenswrapper[4816]: I0311 12:09:39.822042 4816 scope.go:117] "RemoveContainer" containerID="a1d4bcf565d8188182640fd6dfae19dbf3e118e4747e8a92039a28e6b5c3c95c" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.148003 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:10:00 crc kubenswrapper[4816]: E0311 12:10:00.148821 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" containerName="oc" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.148838 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" containerName="oc" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.149002 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" containerName="oc" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.149426 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.152023 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.152175 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.153987 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.159942 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.310862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") pod \"auto-csr-approver-29553850-v7tlf\" (UID: \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\") " pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.412550 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") pod \"auto-csr-approver-29553850-v7tlf\" (UID: \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\") " pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.440642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") pod \"auto-csr-approver-29553850-v7tlf\" (UID: \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\") " 
pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.474004 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.682718 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:10:00 crc kubenswrapper[4816]: W0311 12:10:00.687912 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ac620ec_72d5_4603_852f_8ba3f1ad0e9b.slice/crio-ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd WatchSource:0}: Error finding container ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd: Status 404 returned error can't find the container with id ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd Mar 11 12:10:00 crc kubenswrapper[4816]: I0311 12:10:00.953520 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" event={"ID":"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b","Type":"ContainerStarted","Data":"ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd"} Mar 11 12:10:02 crc kubenswrapper[4816]: E0311 12:10:02.306507 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ac620ec_72d5_4603_852f_8ba3f1ad0e9b.slice/crio-conmon-c3ad155fc5f3f7204d5fb77b61c79c6603bc6f42436d74dbc3171b2dbf21bbd2.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:10:02 crc kubenswrapper[4816]: I0311 12:10:02.967959 4816 generic.go:334] "Generic (PLEG): container finished" podID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" containerID="c3ad155fc5f3f7204d5fb77b61c79c6603bc6f42436d74dbc3171b2dbf21bbd2" exitCode=0 Mar 11 12:10:02 crc 
kubenswrapper[4816]: I0311 12:10:02.968018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" event={"ID":"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b","Type":"ContainerDied","Data":"c3ad155fc5f3f7204d5fb77b61c79c6603bc6f42436d74dbc3171b2dbf21bbd2"} Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.263921 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.364207 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") pod \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\" (UID: \"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b\") " Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.375576 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5" (OuterVolumeSpecName: "kube-api-access-846l5") pod "5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" (UID: "5ac620ec-72d5-4603-852f-8ba3f1ad0e9b"). InnerVolumeSpecName "kube-api-access-846l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.466270 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-846l5\" (UniqueName: \"kubernetes.io/projected/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b-kube-api-access-846l5\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.981627 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" event={"ID":"5ac620ec-72d5-4603-852f-8ba3f1ad0e9b","Type":"ContainerDied","Data":"ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd"} Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.981668 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553850-v7tlf" Mar 11 12:10:04 crc kubenswrapper[4816]: I0311 12:10:04.981672 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca70e525a30b28a721f12c5d635dc0f8922cf4f327d7e56a5a1758282eb289bd" Mar 11 12:10:05 crc kubenswrapper[4816]: I0311 12:10:05.333652 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:10:05 crc kubenswrapper[4816]: I0311 12:10:05.337548 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553844-df99q"] Mar 11 12:10:06 crc kubenswrapper[4816]: I0311 12:10:06.141412 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16125795-8697-470d-bc37-1ab8f6e31af1" path="/var/lib/kubelet/pods/16125795-8697-470d-bc37-1ab8f6e31af1/volumes" Mar 11 12:10:34 crc kubenswrapper[4816]: I0311 12:10:34.495516 4816 scope.go:117] "RemoveContainer" containerID="3d89e5845eb14d7e6c90a432b751164398e20a4fe55d6026ce8f8ec622962660" Mar 11 12:10:37 crc kubenswrapper[4816]: I0311 12:10:37.283835 4816 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.228481 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkh2h"] Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234378 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234404 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="nbdb" containerID="cri-o://45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234554 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="northd" containerID="cri-o://62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234785 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-node" containerID="cri-o://9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234853 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-acl-logging" 
containerID="cri-o://bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234812 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="sbdb" containerID="cri-o://a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.234312 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-controller" containerID="cri-o://ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.277549 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovnkube-controller" containerID="cri-o://6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" gracePeriod=30 Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.576992 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dkh2h_8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/ovn-acl-logging/0.log" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.578130 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dkh2h_8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/ovn-controller/0.log" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.578792 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658442 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658487 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658504 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658519 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658538 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658550 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658568 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658587 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658605 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658630 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658638 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658663 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658668 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash" (OuterVolumeSpecName: "host-slash") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658686 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658729 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658750 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 
12:10:48.658775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658800 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658815 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658835 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658687 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log" (OuterVolumeSpecName: "node-log") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658883 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket" (OuterVolumeSpecName: "log-socket") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658910 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658851 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659077 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") pod \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\" (UID: \"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528\") " Mar 11 12:10:48 crc 
kubenswrapper[4816]: I0311 12:10:48.659132 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659160 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658703 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658735 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659175 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658755 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658808 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658812 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.658859 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659415 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659843 4816 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659883 4816 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659898 4816 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659913 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-config\") on node \"crc\" 
DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659935 4816 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659958 4816 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-slash\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659976 4816 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.659995 4816 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660011 4816 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660032 4816 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660051 4816 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660068 4816 reconciler_common.go:293] 
"Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-node-log\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660087 4816 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660104 4816 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660119 4816 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660131 4816 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-log-socket\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.660143 4816 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.668002 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk" (OuterVolumeSpecName: "kube-api-access-dj5rk") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "kube-api-access-dj5rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.671443 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685415 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hhq62"] Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685649 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="nbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685664 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="nbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685674 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovnkube-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685680 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovnkube-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685688 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kubecfg-setup" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685695 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kubecfg-setup" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685706 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685712 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685723 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" containerName="oc" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685731 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" containerName="oc" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685738 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685744 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685751 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="sbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685757 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="sbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685765 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="northd" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685770 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="northd" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685777 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" 
containerName="ovn-acl-logging" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685783 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-acl-logging" Mar 11 12:10:48 crc kubenswrapper[4816]: E0311 12:10:48.685791 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-node" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685796 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-node" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685886 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685896 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="northd" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685903 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="sbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685912 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685918 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovnkube-controller" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685924 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" containerName="oc" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685932 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="ovn-acl-logging" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685939 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="kube-rbac-proxy-node" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.685946 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerName="nbdb" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.687639 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.689778 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" (UID: "8fbe3bb6-8bf9-40b5-8f4f-0d136e285528"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761202 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-etc-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761259 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovn-node-metrics-cert\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761281 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-log-socket\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761302 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-script-lib\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761412 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-config\") pod \"ovnkube-node-hhq62\" (UID: 
\"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761443 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-var-lib-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-ovn\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761654 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-kubelet\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761691 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-netns\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761706 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-systemd\") pod \"ovnkube-node-hhq62\" 
(UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xlxm\" (UniqueName: \"kubernetes.io/projected/37b06e28-edcf-42e0-b392-7a1bc070f06d-kube-api-access-5xlxm\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761798 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-slash\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761843 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761867 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-bin\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761889 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.761997 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-node-log\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762027 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762142 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-netd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762186 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-env-overrides\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762212 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-systemd-units\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762329 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762359 4816 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.762373 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj5rk\" (UniqueName: \"kubernetes.io/projected/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528-kube-api-access-dj5rk\") on node \"crc\" DevicePath \"\"" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863816 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-ovn\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-kubelet\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863912 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-netns\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863925 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-systemd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863932 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-ovn\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863999 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-netns\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864006 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-kubelet\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.863946 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xlxm\" (UniqueName: \"kubernetes.io/projected/37b06e28-edcf-42e0-b392-7a1bc070f06d-kube-api-access-5xlxm\") pod \"ovnkube-node-hhq62\" (UID: 
\"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864074 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-slash\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864093 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-systemd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864141 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864121 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-run-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864173 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-slash\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc 
kubenswrapper[4816]: I0311 12:10:48.864183 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-bin\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864204 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864235 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-node-log\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864272 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864295 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" 
Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864310 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-node-log\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864326 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864345 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-netd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864366 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-env-overrides\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864388 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-systemd-units\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864418 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-netd\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864407 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-host-cni-bin\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864489 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-etc-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-etc-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864450 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-systemd-units\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864684 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovn-node-metrics-cert\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864806 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-log-socket\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864897 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-script-lib\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-env-overrides\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864948 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-log-socket\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.864955 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-config\") pod 
\"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.865076 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-var-lib-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.865240 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b06e28-edcf-42e0-b392-7a1bc070f06d-var-lib-openvswitch\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.865410 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-script-lib\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.867004 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovnkube-config\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.867622 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b06e28-edcf-42e0-b392-7a1bc070f06d-ovn-node-metrics-cert\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:48 crc kubenswrapper[4816]: I0311 12:10:48.881624 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xlxm\" (UniqueName: \"kubernetes.io/projected/37b06e28-edcf-42e0-b392-7a1bc070f06d-kube-api-access-5xlxm\") pod \"ovnkube-node-hhq62\" (UID: \"37b06e28-edcf-42e0-b392-7a1bc070f06d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.011195 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.267048 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dkh2h_8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/ovn-acl-logging/0.log" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268000 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dkh2h_8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/ovn-controller/0.log" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268393 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268430 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268438 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268446 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268452 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268459 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268466 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" exitCode=143 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268472 4816 generic.go:334] "Generic (PLEG): container finished" podID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" exitCode=143 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268521 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268548 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268558 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" 
event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268577 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268586 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268596 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268605 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268610 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268617 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268623 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268629 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268635 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268640 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268644 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268649 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268654 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 
12:10:49.268659 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268664 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268670 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268679 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268686 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268692 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268698 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268705 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268712 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268718 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268724 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268730 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" event={"ID":"8fbe3bb6-8bf9-40b5-8f4f-0d136e285528","Type":"ContainerDied","Data":"062d70cdf9dcd40a3c2ebd1f383f192eaa42464d705740bb35123cc3c8899d9b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268745 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268752 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268757 4816 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268763 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268768 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268773 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268778 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268783 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268788 4816 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268801 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.268952 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkh2h" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.283361 4816 generic.go:334] "Generic (PLEG): container finished" podID="37b06e28-edcf-42e0-b392-7a1bc070f06d" containerID="d3c3478146da9a34b25c91a65adac491a71029a1a952a7c80271260c570ded3a" exitCode=0 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.283432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerDied","Data":"d3c3478146da9a34b25c91a65adac491a71029a1a952a7c80271260c570ded3a"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.283463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"29e047aa3cc913eb06b09bb0c8d06dd7acabc7b022d4b8b55f808f2caefbb5c4"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.285335 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mdbt5_a30d3e88-e081-4303-a202-1b7505629539/kube-multus/0.log" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.285406 4816 generic.go:334] "Generic (PLEG): container finished" podID="a30d3e88-e081-4303-a202-1b7505629539" containerID="cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba" exitCode=2 Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.285464 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mdbt5" event={"ID":"a30d3e88-e081-4303-a202-1b7505629539","Type":"ContainerDied","Data":"cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba"} Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.286310 4816 scope.go:117] "RemoveContainer" containerID="cd735f189f4c35270aee2182d3a2eecb596607b294cc97d17559e7b5727e8dba" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.300983 4816 
scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.305816 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkh2h"] Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.310418 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkh2h"] Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.334425 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.376142 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.401608 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.417024 4816 scope.go:117] "RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.435043 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.455719 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.473768 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.508478 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.509598 4816 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.509649 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.509681 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.510001 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.510030 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container 
\"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.510043 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.510537 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.510554 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.510568 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.511362 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" 
containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.511397 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.511421 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.512380 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.512412 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.512430 4816 scope.go:117] 
"RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.512724 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.512779 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.512813 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.513109 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.513133 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} err="failed to get container status \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.513151 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.513360 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.513381 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} err="failed to get container status \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.513397 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: E0311 12:10:49.526907 4816 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not exist" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.526979 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} err="failed to get container status \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.527016 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.529661 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.529718 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530290 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530314 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530704 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530721 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.530995 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 
62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531013 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531297 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531316 4816 scope.go:117] "RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531530 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531548 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531824 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} err="failed to get container status \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.531846 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532079 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} err="failed to get container status \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532095 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532316 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} err="failed to get container status \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not 
exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532332 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532540 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532558 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532756 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.532768 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.533455 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status 
\"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.533473 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.533774 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.533821 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534107 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist" Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534132 4816 scope.go:117] "RemoveContainer" 
containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534390 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534409 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534633 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} err="failed to get container status \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534647 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534968 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} err="failed to get container status \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.534988 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535293 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} err="failed to get container status \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535317 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535687 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535705 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535981 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.535999 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.536262 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.536283 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.538555 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.538585 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539002 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539063 4816 scope.go:117] "RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539491 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539535 4816 scope.go:117] "RemoveContainer" containerID="bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539923 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47"} err="failed to get container status \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": rpc error: code = NotFound desc = could not find container \"bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47\": container with ID starting with bef174260e06851687dbedcec11a0599ed9f08d6ba6b3cb4688cae8c7d7f0f47 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.539944 4816 scope.go:117] "RemoveContainer" containerID="ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540237 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b"} err="failed to get container status \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": rpc error: code = NotFound desc = could not find container \"ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b\": container with ID starting with ab86507cece6148bfb5305f3299770111bf6853f86c91b0b376164677a5bf07b not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540284 4816 scope.go:117] "RemoveContainer" containerID="8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540548 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5"} err="failed to get container status \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": rpc error: code = NotFound desc = could not find container \"8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5\": container with ID starting with 8ea27adad9e8f326681e07556be05a267303282a28775b674cbf8574d07fa9c5 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540569 4816 scope.go:117] "RemoveContainer" containerID="6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.540954 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e"} err="failed to get container status \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": rpc error: code = NotFound desc = could not find container \"6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e\": container with ID starting with 6f38c7d36fef29b9ff2fb66adf061d8299226308376044ae9a1c7266973f7c6e not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541053 4816 scope.go:117] "RemoveContainer" containerID="a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541395 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586"} err="failed to get container status \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": rpc error: code = NotFound desc = could not find container \"a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586\": container with ID starting with a40fbd497fbe7fd207a396ce46d9fa0138ef70cde6106865fe205e44e7da4586 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541418 4816 scope.go:117] "RemoveContainer" containerID="45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541698 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63"} err="failed to get container status \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": rpc error: code = NotFound desc = could not find container \"45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63\": container with ID starting with 45f0c6c2f1270ea5adbfbf2927832ab8e0f2c3b6cee69d4300bc67a62b6ccb63 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.541772 4816 scope.go:117] "RemoveContainer" containerID="62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542079 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46"} err="failed to get container status \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": rpc error: code = NotFound desc = could not find container \"62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46\": container with ID starting with 62072aec56439291e25a396ebfbbeefdf9202b0c1c6552d8cd061c6f0871bf46 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542101 4816 scope.go:117] "RemoveContainer" containerID="bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542389 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7"} err="failed to get container status \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": rpc error: code = NotFound desc = could not find container \"bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7\": container with ID starting with bfd19564472b154904d4b9f705584bbe4d35c4bb6baa150d91fa7122b48d46b7 not found: ID does not exist"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542481 4816 scope.go:117] "RemoveContainer" containerID="9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"
Mar 11 12:10:49 crc kubenswrapper[4816]: I0311 12:10:49.542793 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2"} err="failed to get container status \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": rpc error: code = NotFound desc = could not find container \"9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2\": container with ID starting with 9db62837a4792523641b9dc44e1dd9780e9320bc1308fd73d471853ced368ca2 not found: ID does not exist"
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.140512 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbe3bb6-8bf9-40b5-8f4f-0d136e285528" path="/var/lib/kubelet/pods/8fbe3bb6-8bf9-40b5-8f4f-0d136e285528/volumes"
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.300050 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mdbt5_a30d3e88-e081-4303-a202-1b7505629539/kube-multus/0.log"
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.300210 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mdbt5" event={"ID":"a30d3e88-e081-4303-a202-1b7505629539","Type":"ContainerStarted","Data":"4d4d255d20dc4eee3b47010d5f77933f5ae0bf035b74f040a7ea1d371bea82d5"}
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305451 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"9737746047f66b88abbbbdc3e85f0bb5e80305c21ae69d4acf8709eaed03e483"}
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305481 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"afcba8530bd65f111d0abc28f8fe448dad135747684fb008363c43368a57a5a3"}
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305513 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"66cba12be44bdddf5b48d542c633822cfa39f6f6d09a7e4ee54d4fcf181fa63d"}
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"54b18462010b3fbe4d11a8e256bad80b1c6d1fdc265b08f1e48f0874413543aa"}
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305535 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"d27f9f31ccafad332da05428465c1af6015204ffb5d97aa20dbfdfe1c590d017"}
Mar 11 12:10:50 crc kubenswrapper[4816]: I0311 12:10:50.305544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"18c3ed790a7d28d7dc1850b94b9c3bbe9e1f81d84eab099b53d6bf1aad414c53"}
Mar 11 12:10:52 crc kubenswrapper[4816]: I0311 12:10:52.329863 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"8b1d050c2e528cf69a04cd8dac85eefe5e81a78c693234353db0f29272f52c47"}
Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.937604 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-69hv5"]
Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.939203 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.941744 4816 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-zmgc9"
Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.941783 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.941926 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 11 12:10:53 crc kubenswrapper[4816]: I0311 12:10:53.942290 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.048932 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.048981 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.049027 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.149801 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.149935 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.149984 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.150227 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.150995 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.173928 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") pod \"crc-storage-crc-69hv5\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") " pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: I0311 12:10:54.262750 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: E0311 12:10:54.303621 4816 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(82b8e2422033b55e0e30d21316f885f5b691a6fcebc8a4745a9f45991a879231): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 11 12:10:54 crc kubenswrapper[4816]: E0311 12:10:54.303793 4816 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(82b8e2422033b55e0e30d21316f885f5b691a6fcebc8a4745a9f45991a879231): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: E0311 12:10:54.303845 4816 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(82b8e2422033b55e0e30d21316f885f5b691a6fcebc8a4745a9f45991a879231): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:54 crc kubenswrapper[4816]: E0311 12:10:54.304039 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-69hv5_crc-storage(7462073d-1852-4032-87bc-e0a4b973f92f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-69hv5_crc-storage(7462073d-1852-4032-87bc-e0a4b973f92f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(82b8e2422033b55e0e30d21316f885f5b691a6fcebc8a4745a9f45991a879231): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-69hv5" podUID="7462073d-1852-4032-87bc-e0a4b973f92f"
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.366279 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" event={"ID":"37b06e28-edcf-42e0-b392-7a1bc070f06d","Type":"ContainerStarted","Data":"9f1b1c7e6daeeba219bc9ee757df3614a1d23c2e5b414924e156964abdb003ac"}
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.367002 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62"
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.367043 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62"
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.367061 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62"
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.404534 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62"
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.408157 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62"
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.417092 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" podStartSLOduration=7.417065085 podStartE2EDuration="7.417065085s" podCreationTimestamp="2026-03-11 12:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:10:55.409013043 +0000 UTC m=+742.000277050" watchObservedRunningTime="2026-03-11 12:10:55.417065085 +0000 UTC m=+742.008329062"
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.474225 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-69hv5"]
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.474385 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:55 crc kubenswrapper[4816]: I0311 12:10:55.474824 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:55 crc kubenswrapper[4816]: E0311 12:10:55.502061 4816 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(d3e8a72107756c6a010a0556a446d446f255b7b50caf8d168e9fd1eb54845a1a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 11 12:10:55 crc kubenswrapper[4816]: E0311 12:10:55.502623 4816 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(d3e8a72107756c6a010a0556a446d446f255b7b50caf8d168e9fd1eb54845a1a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:55 crc kubenswrapper[4816]: E0311 12:10:55.502655 4816 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(d3e8a72107756c6a010a0556a446d446f255b7b50caf8d168e9fd1eb54845a1a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:10:55 crc kubenswrapper[4816]: E0311 12:10:55.502737 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-69hv5_crc-storage(7462073d-1852-4032-87bc-e0a4b973f92f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-69hv5_crc-storage(7462073d-1852-4032-87bc-e0a4b973f92f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-69hv5_crc-storage_7462073d-1852-4032-87bc-e0a4b973f92f_0(d3e8a72107756c6a010a0556a446d446f255b7b50caf8d168e9fd1eb54845a1a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-69hv5" podUID="7462073d-1852-4032-87bc-e0a4b973f92f"
Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.130048 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.131446 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.350889 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-69hv5"]
Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.357099 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 12:11:07 crc kubenswrapper[4816]: I0311 12:11:07.472693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-69hv5" event={"ID":"7462073d-1852-4032-87bc-e0a4b973f92f","Type":"ContainerStarted","Data":"521bfddd61bb4df175bc1fb7dfc68b6912550738c88aba8b5a47a1a00c59a39f"}
Mar 11 12:11:09 crc kubenswrapper[4816]: I0311 12:11:09.486718 4816 generic.go:334] "Generic (PLEG): container finished" podID="7462073d-1852-4032-87bc-e0a4b973f92f" containerID="0ee4f053b0c8963adb31e4e6ffaf9c7c100dafccbfa493c26f5254141c13917c" exitCode=0
Mar 11 12:11:09 crc kubenswrapper[4816]: I0311 12:11:09.486773 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-69hv5" event={"ID":"7462073d-1852-4032-87bc-e0a4b973f92f","Type":"ContainerDied","Data":"0ee4f053b0c8963adb31e4e6ffaf9c7c100dafccbfa493c26f5254141c13917c"}
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.753565 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.798039 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") pod \"7462073d-1852-4032-87bc-e0a4b973f92f\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") "
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.798499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") pod \"7462073d-1852-4032-87bc-e0a4b973f92f\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") "
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.798581 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") pod \"7462073d-1852-4032-87bc-e0a4b973f92f\" (UID: \"7462073d-1852-4032-87bc-e0a4b973f92f\") "
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.798880 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7462073d-1852-4032-87bc-e0a4b973f92f" (UID: "7462073d-1852-4032-87bc-e0a4b973f92f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.803384 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8" (OuterVolumeSpecName: "kube-api-access-zp8n8") pod "7462073d-1852-4032-87bc-e0a4b973f92f" (UID: "7462073d-1852-4032-87bc-e0a4b973f92f"). InnerVolumeSpecName "kube-api-access-zp8n8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.810163 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7462073d-1852-4032-87bc-e0a4b973f92f" (UID: "7462073d-1852-4032-87bc-e0a4b973f92f"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.900180 4816 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7462073d-1852-4032-87bc-e0a4b973f92f-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.900227 4816 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7462073d-1852-4032-87bc-e0a4b973f92f-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 11 12:11:10 crc kubenswrapper[4816]: I0311 12:11:10.900240 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp8n8\" (UniqueName: \"kubernetes.io/projected/7462073d-1852-4032-87bc-e0a4b973f92f-kube-api-access-zp8n8\") on node \"crc\" DevicePath \"\""
Mar 11 12:11:11 crc kubenswrapper[4816]: I0311 12:11:11.503721 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-69hv5" event={"ID":"7462073d-1852-4032-87bc-e0a4b973f92f","Type":"ContainerDied","Data":"521bfddd61bb4df175bc1fb7dfc68b6912550738c88aba8b5a47a1a00c59a39f"}
Mar 11 12:11:11 crc kubenswrapper[4816]: I0311 12:11:11.503781 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="521bfddd61bb4df175bc1fb7dfc68b6912550738c88aba8b5a47a1a00c59a39f"
Mar 11 12:11:11 crc kubenswrapper[4816]: I0311 12:11:11.503849 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-69hv5"
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.877319 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"]
Mar 11 12:11:18 crc kubenswrapper[4816]: E0311 12:11:18.878061 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" containerName="storage"
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.878078 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" containerName="storage"
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.878212 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" containerName="storage"
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.879241 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.882172 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.883773 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"]
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.903412 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.903553 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"
Mar 11 12:11:18 crc kubenswrapper[4816]: I0311 12:11:18.903962 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"
Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.004484 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"
Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.004542 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"
Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.004586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.005063 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.005132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.032449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hhq62" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.032716 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.198703 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:19 crc kubenswrapper[4816]: I0311 12:11:19.620030 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp"] Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.567305 4816 generic.go:334] "Generic (PLEG): container finished" podID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerID="01ba9569641c593f0a0de55cac3b2f8df054eba1c3c74eeda2593404e7f454af" exitCode=0 Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.567398 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerDied","Data":"01ba9569641c593f0a0de55cac3b2f8df054eba1c3c74eeda2593404e7f454af"} Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.567774 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerStarted","Data":"fe0c8bc583ce4621ae0ee36bf084ea55f1012d84cd0125dadd104db67728e406"} Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.899117 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.904300 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.915769 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.930737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.930810 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:20 crc kubenswrapper[4816]: I0311 12:11:20.930861 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.031691 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.031764 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.031842 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.032384 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.032416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.068581 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") pod \"redhat-operators-c29dr\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:21 crc kubenswrapper[4816]: I0311 12:11:21.231764 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.326397 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.580090 4816 generic.go:334] "Generic (PLEG): container finished" podID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerID="003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3" exitCode=0 Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.580161 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerDied","Data":"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3"} Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.580195 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerStarted","Data":"f55816d844fceb14f1b2e972a2c499707de36af2a78a8f7ba4306a161a99924b"} Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.582631 4816 generic.go:334] "Generic (PLEG): container finished" podID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerID="b1b201d6c2d8d849456a7f9a96ac6a34ca3bbe4fc702ab0e66e8390510c6f970" exitCode=0 Mar 11 12:11:22 crc kubenswrapper[4816]: I0311 12:11:22.582661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerDied","Data":"b1b201d6c2d8d849456a7f9a96ac6a34ca3bbe4fc702ab0e66e8390510c6f970"} Mar 11 12:11:23 crc kubenswrapper[4816]: I0311 12:11:23.589660 4816 generic.go:334] "Generic (PLEG): container finished" podID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" 
containerID="a03404bb8b9128b37e1393dcbcd23ba56c893a0ec7a237e8c2ff7880bcb37b34" exitCode=0 Mar 11 12:11:23 crc kubenswrapper[4816]: I0311 12:11:23.589717 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerDied","Data":"a03404bb8b9128b37e1393dcbcd23ba56c893a0ec7a237e8c2ff7880bcb37b34"} Mar 11 12:11:23 crc kubenswrapper[4816]: I0311 12:11:23.592208 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerStarted","Data":"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a"} Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.603113 4816 generic.go:334] "Generic (PLEG): container finished" podID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerID="4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a" exitCode=0 Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.603344 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerDied","Data":"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a"} Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.948289 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.994590 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") pod \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.994657 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") pod \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.994707 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") pod \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\" (UID: \"2a03942f-8b0e-4041-8843-ad5e6cedc6b0\") " Mar 11 12:11:24 crc kubenswrapper[4816]: I0311 12:11:24.995740 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle" (OuterVolumeSpecName: "bundle") pod "2a03942f-8b0e-4041-8843-ad5e6cedc6b0" (UID: "2a03942f-8b0e-4041-8843-ad5e6cedc6b0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.003648 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7" (OuterVolumeSpecName: "kube-api-access-4jjk7") pod "2a03942f-8b0e-4041-8843-ad5e6cedc6b0" (UID: "2a03942f-8b0e-4041-8843-ad5e6cedc6b0"). InnerVolumeSpecName "kube-api-access-4jjk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.097139 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jjk7\" (UniqueName: \"kubernetes.io/projected/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-kube-api-access-4jjk7\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.097168 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.256010 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util" (OuterVolumeSpecName: "util") pod "2a03942f-8b0e-4041-8843-ad5e6cedc6b0" (UID: "2a03942f-8b0e-4041-8843-ad5e6cedc6b0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.300211 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a03942f-8b0e-4041-8843-ad5e6cedc6b0-util\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.616848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" event={"ID":"2a03942f-8b0e-4041-8843-ad5e6cedc6b0","Type":"ContainerDied","Data":"fe0c8bc583ce4621ae0ee36bf084ea55f1012d84cd0125dadd104db67728e406"} Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.616904 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe0c8bc583ce4621ae0ee36bf084ea55f1012d84cd0125dadd104db67728e406" Mar 11 12:11:25 crc kubenswrapper[4816]: I0311 12:11:25.616938 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp" Mar 11 12:11:26 crc kubenswrapper[4816]: I0311 12:11:26.629089 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerStarted","Data":"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9"} Mar 11 12:11:26 crc kubenswrapper[4816]: I0311 12:11:26.660608 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c29dr" podStartSLOduration=3.642467768 podStartE2EDuration="6.660576774s" podCreationTimestamp="2026-03-11 12:11:20 +0000 UTC" firstStartedPulling="2026-03-11 12:11:22.58141548 +0000 UTC m=+769.172679457" lastFinishedPulling="2026-03-11 12:11:25.599524456 +0000 UTC m=+772.190788463" observedRunningTime="2026-03-11 12:11:26.653413108 +0000 UTC m=+773.244677155" watchObservedRunningTime="2026-03-11 12:11:26.660576774 +0000 UTC m=+773.251840781" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.240361 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-g59xq"] Mar 11 12:11:29 crc kubenswrapper[4816]: E0311 12:11:29.240926 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="extract" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.240940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="extract" Mar 11 12:11:29 crc kubenswrapper[4816]: E0311 12:11:29.240960 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="pull" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.240966 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="pull" Mar 
11 12:11:29 crc kubenswrapper[4816]: E0311 12:11:29.240977 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="util" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.240983 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="util" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.241088 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a03942f-8b0e-4041-8843-ad5e6cedc6b0" containerName="extract" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.241478 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.243795 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-w7sf4" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.250434 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.251545 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.258701 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-g59xq"] Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.354740 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldlnf\" (UniqueName: \"kubernetes.io/projected/c1f09ebe-c0e1-415c-9ea9-42fc42240e94-kube-api-access-ldlnf\") pod \"nmstate-operator-796d4cfff4-g59xq\" (UID: \"c1f09ebe-c0e1-415c-9ea9-42fc42240e94\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.456107 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlnf\" (UniqueName: \"kubernetes.io/projected/c1f09ebe-c0e1-415c-9ea9-42fc42240e94-kube-api-access-ldlnf\") pod \"nmstate-operator-796d4cfff4-g59xq\" (UID: \"c1f09ebe-c0e1-415c-9ea9-42fc42240e94\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.480359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldlnf\" (UniqueName: \"kubernetes.io/projected/c1f09ebe-c0e1-415c-9ea9-42fc42240e94-kube-api-access-ldlnf\") pod \"nmstate-operator-796d4cfff4-g59xq\" (UID: \"c1f09ebe-c0e1-415c-9ea9-42fc42240e94\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.559879 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" Mar 11 12:11:29 crc kubenswrapper[4816]: I0311 12:11:29.892898 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-g59xq"] Mar 11 12:11:30 crc kubenswrapper[4816]: I0311 12:11:30.653858 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" event={"ID":"c1f09ebe-c0e1-415c-9ea9-42fc42240e94","Type":"ContainerStarted","Data":"368c9f058bcf11df278bc0ef15c7f90449d2b78f39b6a1fe4f8d949c96b6155a"} Mar 11 12:11:31 crc kubenswrapper[4816]: I0311 12:11:31.232676 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:31 crc kubenswrapper[4816]: I0311 12:11:31.233068 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:32 crc kubenswrapper[4816]: I0311 12:11:32.293966 4816 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-c29dr" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" probeResult="failure" output=< Mar 11 12:11:32 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:11:32 crc kubenswrapper[4816]: > Mar 11 12:11:33 crc kubenswrapper[4816]: I0311 12:11:33.698402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" event={"ID":"c1f09ebe-c0e1-415c-9ea9-42fc42240e94","Type":"ContainerStarted","Data":"154082c663c85dd386ba1f00d26dbf18b99c080627086950c172f7b5f6ec450a"} Mar 11 12:11:33 crc kubenswrapper[4816]: I0311 12:11:33.728973 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-g59xq" podStartSLOduration=1.275402976 podStartE2EDuration="4.72894315s" podCreationTimestamp="2026-03-11 12:11:29 +0000 UTC" firstStartedPulling="2026-03-11 12:11:29.902572693 +0000 UTC m=+776.493836660" lastFinishedPulling="2026-03-11 12:11:33.356112867 +0000 UTC m=+779.947376834" observedRunningTime="2026-03-11 12:11:33.725011658 +0000 UTC m=+780.316275625" watchObservedRunningTime="2026-03-11 12:11:33.72894315 +0000 UTC m=+780.320207127" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.888422 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd"] Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.890071 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.895144 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-v2wv2" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.900128 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xq48v"] Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.900969 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.903361 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.914462 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xq48v"] Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.931179 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-47rs2"] Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.932121 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:38 crc kubenswrapper[4816]: I0311 12:11:38.953553 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.019645 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b664fad-a0fa-4442-bed2-3316eafbb78c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.019685 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4fls\" (UniqueName: \"kubernetes.io/projected/7fb0dcd0-9411-49d6-a997-79d2099b2462-kube-api-access-m4fls\") pod \"nmstate-metrics-9b8c8685d-2snpd\" (UID: \"7fb0dcd0-9411-49d6-a997-79d2099b2462\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.019717 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj6pt\" (UniqueName: \"kubernetes.io/projected/1b664fad-a0fa-4442-bed2-3316eafbb78c-kube-api-access-lj6pt\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.034681 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.035454 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.037524 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.042004 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.042629 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mpl9k" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.051564 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.120911 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-ovs-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121020 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b958v\" (UniqueName: \"kubernetes.io/projected/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-kube-api-access-b958v\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121065 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b664fad-a0fa-4442-bed2-3316eafbb78c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 
12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121083 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4fls\" (UniqueName: \"kubernetes.io/projected/7fb0dcd0-9411-49d6-a997-79d2099b2462-kube-api-access-m4fls\") pod \"nmstate-metrics-9b8c8685d-2snpd\" (UID: \"7fb0dcd0-9411-49d6-a997-79d2099b2462\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121108 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-nmstate-lock\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121126 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-dbus-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.121142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj6pt\" (UniqueName: \"kubernetes.io/projected/1b664fad-a0fa-4442-bed2-3316eafbb78c-kube-api-access-lj6pt\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.127664 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b664fad-a0fa-4442-bed2-3316eafbb78c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.143441 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj6pt\" (UniqueName: \"kubernetes.io/projected/1b664fad-a0fa-4442-bed2-3316eafbb78c-kube-api-access-lj6pt\") pod \"nmstate-webhook-5f558f5558-xq48v\" (UID: \"1b664fad-a0fa-4442-bed2-3316eafbb78c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.144129 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4fls\" (UniqueName: \"kubernetes.io/projected/7fb0dcd0-9411-49d6-a997-79d2099b2462-kube-api-access-m4fls\") pod \"nmstate-metrics-9b8c8685d-2snpd\" (UID: \"7fb0dcd0-9411-49d6-a997-79d2099b2462\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.213128 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.219770 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd5b84cfd-qgdq4"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.220826 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.221685 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222401 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a822f6ee-e723-4f64-b4f6-c948dc948359-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222494 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a822f6ee-e723-4f64-b4f6-c948dc948359-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222546 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-nmstate-lock\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222579 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-dbus-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222633 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vclh8\" (UniqueName: \"kubernetes.io/projected/a822f6ee-e723-4f64-b4f6-c948dc948359-kube-api-access-vclh8\") 
pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222686 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-ovs-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.222722 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b958v\" (UniqueName: \"kubernetes.io/projected/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-kube-api-access-b958v\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.223303 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-nmstate-lock\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.223560 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-ovs-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.223671 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-dbus-socket\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " 
pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.236409 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd5b84cfd-qgdq4"] Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.274480 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b958v\" (UniqueName: \"kubernetes.io/projected/fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9-kube-api-access-b958v\") pod \"nmstate-handler-47rs2\" (UID: \"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9\") " pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vclh8\" (UniqueName: \"kubernetes.io/projected/a822f6ee-e723-4f64-b4f6-c948dc948359-kube-api-access-vclh8\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324773 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-trusted-ca-bundle\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324827 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-service-ca\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324914 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-console-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324948 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a822f6ee-e723-4f64-b4f6-c948dc948359-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.324970 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-oauth-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.325025 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a822f6ee-e723-4f64-b4f6-c948dc948359-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc 
kubenswrapper[4816]: I0311 12:11:39.325061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbr5\" (UniqueName: \"kubernetes.io/projected/2faec9d1-9173-4181-b887-2a375426ff16-kube-api-access-xrbr5\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.325115 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-oauth-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.327273 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a822f6ee-e723-4f64-b4f6-c948dc948359-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.336223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a822f6ee-e723-4f64-b4f6-c948dc948359-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.348952 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vclh8\" (UniqueName: \"kubernetes.io/projected/a822f6ee-e723-4f64-b4f6-c948dc948359-kube-api-access-vclh8\") pod \"nmstate-console-plugin-86f58fcf4-px2gk\" (UID: \"a822f6ee-e723-4f64-b4f6-c948dc948359\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.364696 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426486 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbr5\" (UniqueName: \"kubernetes.io/projected/2faec9d1-9173-4181-b887-2a375426ff16-kube-api-access-xrbr5\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426553 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-oauth-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426591 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-trusted-ca-bundle\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426639 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-service-ca\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426676 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-console-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.426701 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-oauth-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.428864 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-service-ca\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.428935 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-console-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.429522 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-trusted-ca-bundle\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.430857 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2faec9d1-9173-4181-b887-2a375426ff16-oauth-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.431568 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-oauth-config\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.438702 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2faec9d1-9173-4181-b887-2a375426ff16-console-serving-cert\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.444351 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbr5\" (UniqueName: \"kubernetes.io/projected/2faec9d1-9173-4181-b887-2a375426ff16-kube-api-access-xrbr5\") pod \"console-7cd5b84cfd-qgdq4\" (UID: \"2faec9d1-9173-4181-b887-2a375426ff16\") " pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.514959 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.515016 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.545804 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.598022 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.616444 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xq48v"] Mar 11 12:11:39 crc kubenswrapper[4816]: W0311 12:11:39.628060 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b664fad_a0fa_4442_bed2_3316eafbb78c.slice/crio-08d9d109ad0671f8142b1276a83b73e1da951251757dc5caf16e8cec867fd7ce WatchSource:0}: Error finding container 08d9d109ad0671f8142b1276a83b73e1da951251757dc5caf16e8cec867fd7ce: Status 404 returned error can't find the container with id 08d9d109ad0671f8142b1276a83b73e1da951251757dc5caf16e8cec867fd7ce Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.658083 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk"] Mar 11 12:11:39 crc kubenswrapper[4816]: W0311 12:11:39.663347 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda822f6ee_e723_4f64_b4f6_c948dc948359.slice/crio-80e76120d350df905ae611a86507796cc0bc43f29afe54356e1793a2471dfaa2 WatchSource:0}: Error finding container 80e76120d350df905ae611a86507796cc0bc43f29afe54356e1793a2471dfaa2: Status 404 returned error can't find the container with id 80e76120d350df905ae611a86507796cc0bc43f29afe54356e1793a2471dfaa2 Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.731977 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd"] Mar 11 12:11:39 crc kubenswrapper[4816]: W0311 12:11:39.734091 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fb0dcd0_9411_49d6_a997_79d2099b2462.slice/crio-6819cb7b52ed5b74b408ed83c83d8239d6c4295a20a0ad70529fb675bb94dbeb WatchSource:0}: Error finding container 6819cb7b52ed5b74b408ed83c83d8239d6c4295a20a0ad70529fb675bb94dbeb: Status 404 returned error can't find the container with id 6819cb7b52ed5b74b408ed83c83d8239d6c4295a20a0ad70529fb675bb94dbeb Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.741360 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" event={"ID":"a822f6ee-e723-4f64-b4f6-c948dc948359","Type":"ContainerStarted","Data":"80e76120d350df905ae611a86507796cc0bc43f29afe54356e1793a2471dfaa2"} Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.742568 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-47rs2" event={"ID":"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9","Type":"ContainerStarted","Data":"dc13bf644351194920c39fd044bd74e79de536885e38c34e70b730aa1bdb3a6c"} Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.743611 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" 
event={"ID":"1b664fad-a0fa-4442-bed2-3316eafbb78c","Type":"ContainerStarted","Data":"08d9d109ad0671f8142b1276a83b73e1da951251757dc5caf16e8cec867fd7ce"} Mar 11 12:11:39 crc kubenswrapper[4816]: I0311 12:11:39.793281 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd5b84cfd-qgdq4"] Mar 11 12:11:39 crc kubenswrapper[4816]: W0311 12:11:39.802024 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2faec9d1_9173_4181_b887_2a375426ff16.slice/crio-067e7f4c96c162d9c49f51783a6d70a031538a88a941196258da04be5bd4d5dc WatchSource:0}: Error finding container 067e7f4c96c162d9c49f51783a6d70a031538a88a941196258da04be5bd4d5dc: Status 404 returned error can't find the container with id 067e7f4c96c162d9c49f51783a6d70a031538a88a941196258da04be5bd4d5dc Mar 11 12:11:40 crc kubenswrapper[4816]: I0311 12:11:40.750563 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" event={"ID":"7fb0dcd0-9411-49d6-a997-79d2099b2462","Type":"ContainerStarted","Data":"6819cb7b52ed5b74b408ed83c83d8239d6c4295a20a0ad70529fb675bb94dbeb"} Mar 11 12:11:40 crc kubenswrapper[4816]: I0311 12:11:40.753949 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd5b84cfd-qgdq4" event={"ID":"2faec9d1-9173-4181-b887-2a375426ff16","Type":"ContainerStarted","Data":"ee6d9f77d570cb657bb9b8a6ce31de1bb71f653dfece0faa1822af2c1f6108dc"} Mar 11 12:11:40 crc kubenswrapper[4816]: I0311 12:11:40.754027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd5b84cfd-qgdq4" event={"ID":"2faec9d1-9173-4181-b887-2a375426ff16","Type":"ContainerStarted","Data":"067e7f4c96c162d9c49f51783a6d70a031538a88a941196258da04be5bd4d5dc"} Mar 11 12:11:40 crc kubenswrapper[4816]: I0311 12:11:40.775240 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-7cd5b84cfd-qgdq4" podStartSLOduration=1.775222847 podStartE2EDuration="1.775222847s" podCreationTimestamp="2026-03-11 12:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:11:40.774503527 +0000 UTC m=+787.365767494" watchObservedRunningTime="2026-03-11 12:11:40.775222847 +0000 UTC m=+787.366486814" Mar 11 12:11:41 crc kubenswrapper[4816]: I0311 12:11:41.287901 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:41 crc kubenswrapper[4816]: I0311 12:11:41.358796 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:41 crc kubenswrapper[4816]: I0311 12:11:41.519658 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:42 crc kubenswrapper[4816]: I0311 12:11:42.770535 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c29dr" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" containerID="cri-o://95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" gracePeriod=2 Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.314421 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.489311 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") pod \"461ff7ea-1256-4928-9425-ed9840dc4eda\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.489829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") pod \"461ff7ea-1256-4928-9425-ed9840dc4eda\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.490019 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") pod \"461ff7ea-1256-4928-9425-ed9840dc4eda\" (UID: \"461ff7ea-1256-4928-9425-ed9840dc4eda\") " Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.490366 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities" (OuterVolumeSpecName: "utilities") pod "461ff7ea-1256-4928-9425-ed9840dc4eda" (UID: "461ff7ea-1256-4928-9425-ed9840dc4eda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.496271 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq" (OuterVolumeSpecName: "kube-api-access-kqbqq") pod "461ff7ea-1256-4928-9425-ed9840dc4eda" (UID: "461ff7ea-1256-4928-9425-ed9840dc4eda"). InnerVolumeSpecName "kube-api-access-kqbqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.591368 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqbqq\" (UniqueName: \"kubernetes.io/projected/461ff7ea-1256-4928-9425-ed9840dc4eda-kube-api-access-kqbqq\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.592172 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.612332 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "461ff7ea-1256-4928-9425-ed9840dc4eda" (UID: "461ff7ea-1256-4928-9425-ed9840dc4eda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.693980 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461ff7ea-1256-4928-9425-ed9840dc4eda-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.779137 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" event={"ID":"a822f6ee-e723-4f64-b4f6-c948dc948359","Type":"ContainerStarted","Data":"581df34bdea0a6240db949e261a44b234e0c12a9ea1a278bee32fb8c46da765e"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.782069 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" event={"ID":"7fb0dcd0-9411-49d6-a997-79d2099b2462","Type":"ContainerStarted","Data":"1b2ce29c545c5e80c863bcc3ec847584bc5a65bfc8d54215b0f0739a53d5e3e1"} Mar 11 12:11:43 crc 
kubenswrapper[4816]: I0311 12:11:43.785287 4816 generic.go:334] "Generic (PLEG): container finished" podID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerID="95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" exitCode=0 Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.785382 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c29dr" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.785516 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerDied","Data":"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.785740 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c29dr" event={"ID":"461ff7ea-1256-4928-9425-ed9840dc4eda","Type":"ContainerDied","Data":"f55816d844fceb14f1b2e972a2c499707de36af2a78a8f7ba4306a161a99924b"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.785781 4816 scope.go:117] "RemoveContainer" containerID="95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.787548 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-47rs2" event={"ID":"fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9","Type":"ContainerStarted","Data":"ab21c0dd78bbf18d58501846c858efb5e3c52c65baf38bbd2a2a28708397c4cf"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.787740 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.791764 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" 
event={"ID":"1b664fad-a0fa-4442-bed2-3316eafbb78c","Type":"ContainerStarted","Data":"0acffc922382d3b43195460b16c202cccbefd81ad991cb4fa6c1422bdd5412eb"} Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.792587 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.802113 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-px2gk" podStartSLOduration=1.334577114 podStartE2EDuration="4.802096226s" podCreationTimestamp="2026-03-11 12:11:39 +0000 UTC" firstStartedPulling="2026-03-11 12:11:39.666070001 +0000 UTC m=+786.257333968" lastFinishedPulling="2026-03-11 12:11:43.133589103 +0000 UTC m=+789.724853080" observedRunningTime="2026-03-11 12:11:43.799962356 +0000 UTC m=+790.391226313" watchObservedRunningTime="2026-03-11 12:11:43.802096226 +0000 UTC m=+790.393360203" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.819041 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-47rs2" podStartSLOduration=2.219522233 podStartE2EDuration="5.819019358s" podCreationTimestamp="2026-03-11 12:11:38 +0000 UTC" firstStartedPulling="2026-03-11 12:11:39.568597219 +0000 UTC m=+786.159861186" lastFinishedPulling="2026-03-11 12:11:43.168094334 +0000 UTC m=+789.759358311" observedRunningTime="2026-03-11 12:11:43.818487603 +0000 UTC m=+790.409751570" watchObservedRunningTime="2026-03-11 12:11:43.819019358 +0000 UTC m=+790.410283325" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.824445 4816 scope.go:117] "RemoveContainer" containerID="4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.843863 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" 
podStartSLOduration=2.340690259 podStartE2EDuration="5.843844874s" podCreationTimestamp="2026-03-11 12:11:38 +0000 UTC" firstStartedPulling="2026-03-11 12:11:39.631761515 +0000 UTC m=+786.223025482" lastFinishedPulling="2026-03-11 12:11:43.13491613 +0000 UTC m=+789.726180097" observedRunningTime="2026-03-11 12:11:43.83809101 +0000 UTC m=+790.429354977" watchObservedRunningTime="2026-03-11 12:11:43.843844874 +0000 UTC m=+790.435108841" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.852760 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.860033 4816 scope.go:117] "RemoveContainer" containerID="003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.873225 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c29dr"] Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.881049 4816 scope.go:117] "RemoveContainer" containerID="95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" Mar 11 12:11:43 crc kubenswrapper[4816]: E0311 12:11:43.881624 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9\": container with ID starting with 95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9 not found: ID does not exist" containerID="95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.881663 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9"} err="failed to get container status \"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9\": rpc error: code = NotFound desc = could not find container 
\"95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9\": container with ID starting with 95dca1c91ddd21a7baf580c3a3e2aeb2e82668e3bf4d222da6549241684ccef9 not found: ID does not exist" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.881694 4816 scope.go:117] "RemoveContainer" containerID="4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a" Mar 11 12:11:43 crc kubenswrapper[4816]: E0311 12:11:43.882101 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a\": container with ID starting with 4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a not found: ID does not exist" containerID="4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.882148 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a"} err="failed to get container status \"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a\": rpc error: code = NotFound desc = could not find container \"4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a\": container with ID starting with 4f13015aaf50ff3e2d23b9c5a185d266297979cbb0afdf2142cff44ba6662d5a not found: ID does not exist" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.882174 4816 scope.go:117] "RemoveContainer" containerID="003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3" Mar 11 12:11:43 crc kubenswrapper[4816]: E0311 12:11:43.882636 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3\": container with ID starting with 003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3 not found: ID does not exist" 
containerID="003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3" Mar 11 12:11:43 crc kubenswrapper[4816]: I0311 12:11:43.882661 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3"} err="failed to get container status \"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3\": rpc error: code = NotFound desc = could not find container \"003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3\": container with ID starting with 003ac48122b295d5bfa357b7ac23c84d4d289972e7886ac49dd30b9a43ab8fd3 not found: ID does not exist" Mar 11 12:11:44 crc kubenswrapper[4816]: I0311 12:11:44.153060 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" path="/var/lib/kubelet/pods/461ff7ea-1256-4928-9425-ed9840dc4eda/volumes" Mar 11 12:11:46 crc kubenswrapper[4816]: I0311 12:11:46.817216 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" event={"ID":"7fb0dcd0-9411-49d6-a997-79d2099b2462","Type":"ContainerStarted","Data":"ec87d8bb7f0f902050146f6c79fc3f469dea06afd48547bc35ede11e2e3f4512"} Mar 11 12:11:46 crc kubenswrapper[4816]: I0311 12:11:46.866498 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2snpd" podStartSLOduration=2.221221 podStartE2EDuration="8.866452081s" podCreationTimestamp="2026-03-11 12:11:38 +0000 UTC" firstStartedPulling="2026-03-11 12:11:39.735388852 +0000 UTC m=+786.326652819" lastFinishedPulling="2026-03-11 12:11:46.380619933 +0000 UTC m=+792.971883900" observedRunningTime="2026-03-11 12:11:46.843270812 +0000 UTC m=+793.434534779" watchObservedRunningTime="2026-03-11 12:11:46.866452081 +0000 UTC m=+793.457716098" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.582984 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-47rs2" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.600636 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.600750 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.610456 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.847889 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd5b84cfd-qgdq4" Mar 11 12:11:49 crc kubenswrapper[4816]: I0311 12:11:49.925059 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:11:59 crc kubenswrapper[4816]: I0311 12:11:59.232631 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xq48v" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.127851 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:12:00 crc kubenswrapper[4816]: E0311 12:12:00.128110 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="extract-utilities" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.128123 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="extract-utilities" Mar 11 12:12:00 crc kubenswrapper[4816]: E0311 12:12:00.128140 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 
12:12:00.128146 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" Mar 11 12:12:00 crc kubenswrapper[4816]: E0311 12:12:00.128154 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="extract-content" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.128161 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="extract-content" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.128285 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="461ff7ea-1256-4928-9425-ed9840dc4eda" containerName="registry-server" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.128683 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.131566 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.131739 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.135550 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") pod \"auto-csr-approver-29553852-tvtrs\" (UID: \"f1d15245-e206-4f60-a05c-9888a45a1aca\") " pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.135593 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.150173 4816 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.237239 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") pod \"auto-csr-approver-29553852-tvtrs\" (UID: \"f1d15245-e206-4f60-a05c-9888a45a1aca\") " pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.260381 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") pod \"auto-csr-approver-29553852-tvtrs\" (UID: \"f1d15245-e206-4f60-a05c-9888a45a1aca\") " pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.447901 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.655771 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:12:00 crc kubenswrapper[4816]: W0311 12:12:00.662079 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1d15245_e206_4f60_a05c_9888a45a1aca.slice/crio-9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636 WatchSource:0}: Error finding container 9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636: Status 404 returned error can't find the container with id 9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636 Mar 11 12:12:00 crc kubenswrapper[4816]: I0311 12:12:00.919136 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" event={"ID":"f1d15245-e206-4f60-a05c-9888a45a1aca","Type":"ContainerStarted","Data":"9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636"} Mar 11 12:12:02 crc kubenswrapper[4816]: I0311 12:12:02.931666 4816 generic.go:334] "Generic (PLEG): container finished" podID="f1d15245-e206-4f60-a05c-9888a45a1aca" containerID="258e8c83fc2dd9e9c165c147a3085d310c4de5d771038f237098b4b3a09178a8" exitCode=0 Mar 11 12:12:02 crc kubenswrapper[4816]: I0311 12:12:02.931814 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" event={"ID":"f1d15245-e206-4f60-a05c-9888a45a1aca","Type":"ContainerDied","Data":"258e8c83fc2dd9e9c165c147a3085d310c4de5d771038f237098b4b3a09178a8"} Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.159872 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.294184 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") pod \"f1d15245-e206-4f60-a05c-9888a45a1aca\" (UID: \"f1d15245-e206-4f60-a05c-9888a45a1aca\") " Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.299585 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9" (OuterVolumeSpecName: "kube-api-access-6xbn9") pod "f1d15245-e206-4f60-a05c-9888a45a1aca" (UID: "f1d15245-e206-4f60-a05c-9888a45a1aca"). InnerVolumeSpecName "kube-api-access-6xbn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.396772 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xbn9\" (UniqueName: \"kubernetes.io/projected/f1d15245-e206-4f60-a05c-9888a45a1aca-kube-api-access-6xbn9\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.947221 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" event={"ID":"f1d15245-e206-4f60-a05c-9888a45a1aca","Type":"ContainerDied","Data":"9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636"} Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.947315 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd0366cac3cfbf4773d8fb37507e0ae9de9dd8d399a8112b07e635776aac636" Mar 11 12:12:04 crc kubenswrapper[4816]: I0311 12:12:04.947381 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553852-tvtrs" Mar 11 12:12:05 crc kubenswrapper[4816]: I0311 12:12:05.234216 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:12:05 crc kubenswrapper[4816]: I0311 12:12:05.241299 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553846-v5vlq"] Mar 11 12:12:06 crc kubenswrapper[4816]: I0311 12:12:06.142194 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8860b8d2-719a-4930-9df3-d0bc14d8de19" path="/var/lib/kubelet/pods/8860b8d2-719a-4930-9df3-d0bc14d8de19/volumes" Mar 11 12:12:09 crc kubenswrapper[4816]: I0311 12:12:09.515391 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:12:09 crc kubenswrapper[4816]: I0311 12:12:09.516550 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.090171 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8"] Mar 11 12:12:12 crc kubenswrapper[4816]: E0311 12:12:12.091678 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d15245-e206-4f60-a05c-9888a45a1aca" containerName="oc" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.091698 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d15245-e206-4f60-a05c-9888a45a1aca" 
containerName="oc" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.091844 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d15245-e206-4f60-a05c-9888a45a1aca" containerName="oc" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.093169 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.097728 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.101018 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8"] Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.111951 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.111997 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.112041 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.212940 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.213019 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.213145 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.213896 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: 
\"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.213917 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.233583 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.415874 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:12 crc kubenswrapper[4816]: I0311 12:12:12.649112 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8"] Mar 11 12:12:13 crc kubenswrapper[4816]: I0311 12:12:13.007080 4816 generic.go:334] "Generic (PLEG): container finished" podID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerID="82302b4cf683ff043cbae0c1b55f06bd328dde371bf2d875c664d24a7167590a" exitCode=0 Mar 11 12:12:13 crc kubenswrapper[4816]: I0311 12:12:13.007146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerDied","Data":"82302b4cf683ff043cbae0c1b55f06bd328dde371bf2d875c664d24a7167590a"} Mar 11 12:12:13 crc kubenswrapper[4816]: I0311 12:12:13.007206 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerStarted","Data":"f829741e1ad8b32117f692ad3039b9990bd12ea53fdc942a98b351dfc5dfabe3"} Mar 11 12:12:14 crc kubenswrapper[4816]: I0311 12:12:14.979401 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-blgl4" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" containerID="cri-o://95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" gracePeriod=15 Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.426855 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-blgl4_efc988f7-8a1a-4d22-b6bb-b2617c721017/console/0.log" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.427549 4816 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.471441 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.471820 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472002 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472109 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472278 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472379 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.472879 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473113 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config" (OuterVolumeSpecName: "console-config") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473189 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca" (OuterVolumeSpecName: "service-ca") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473378 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") pod \"efc988f7-8a1a-4d22-b6bb-b2617c721017\" (UID: \"efc988f7-8a1a-4d22-b6bb-b2617c721017\") " Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473858 4816 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473877 4816 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473886 4816 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.473984 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.480147 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.481389 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.481731 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf" (OuterVolumeSpecName: "kube-api-access-nngrf") pod "efc988f7-8a1a-4d22-b6bb-b2617c721017" (UID: "efc988f7-8a1a-4d22-b6bb-b2617c721017"). InnerVolumeSpecName "kube-api-access-nngrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.576212 4816 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.576651 4816 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efc988f7-8a1a-4d22-b6bb-b2617c721017-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.576784 4816 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efc988f7-8a1a-4d22-b6bb-b2617c721017-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:15 crc kubenswrapper[4816]: I0311 12:12:15.577094 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nngrf\" (UniqueName: \"kubernetes.io/projected/efc988f7-8a1a-4d22-b6bb-b2617c721017-kube-api-access-nngrf\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.033462 4816 generic.go:334] "Generic (PLEG): container finished" podID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerID="31e55dad710745ed804d06e0331068676e568a5f49ffe62b8d7bae286d8e48a8" exitCode=0 Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.033605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerDied","Data":"31e55dad710745ed804d06e0331068676e568a5f49ffe62b8d7bae286d8e48a8"} Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.038812 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-blgl4_efc988f7-8a1a-4d22-b6bb-b2617c721017/console/0.log" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.039630 4816 generic.go:334] "Generic (PLEG): container finished" podID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerID="95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" exitCode=2 Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.039705 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-blgl4" event={"ID":"efc988f7-8a1a-4d22-b6bb-b2617c721017","Type":"ContainerDied","Data":"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec"} Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.039765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-blgl4" event={"ID":"efc988f7-8a1a-4d22-b6bb-b2617c721017","Type":"ContainerDied","Data":"59a99708271969fdd60bd64b8768b6f0fa05af801e0f7d034beaae8d3d4be471"} Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.039804 4816 scope.go:117] "RemoveContainer" containerID="95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.040871 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-blgl4" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.076630 4816 scope.go:117] "RemoveContainer" containerID="95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" Mar 11 12:12:16 crc kubenswrapper[4816]: E0311 12:12:16.077628 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec\": container with ID starting with 95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec not found: ID does not exist" containerID="95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.077731 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec"} err="failed to get container status \"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec\": rpc error: code = NotFound desc = could not find container \"95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec\": container with ID starting with 95ff1646ffa7c04a2ecc19c185617a275f308d36bb7c3ee54b57d9bc7db028ec not found: ID does not exist" Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.159846 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:12:16 crc kubenswrapper[4816]: I0311 12:12:16.167899 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-blgl4"] Mar 11 12:12:17 crc kubenswrapper[4816]: I0311 12:12:17.050192 4816 generic.go:334] "Generic (PLEG): container finished" podID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerID="52eac97111b4b75a9861e04ef8afb3cfe3ad6fa5a574b7b896f5fa95a5a9fb59" exitCode=0 Mar 11 12:12:17 crc kubenswrapper[4816]: I0311 12:12:17.050328 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerDied","Data":"52eac97111b4b75a9861e04ef8afb3cfe3ad6fa5a574b7b896f5fa95a5a9fb59"} Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.139746 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" path="/var/lib/kubelet/pods/efc988f7-8a1a-4d22-b6bb-b2617c721017/volumes" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.348634 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.421758 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") pod \"5dff60f3-3acf-4dfb-9098-917736f61c0c\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.422124 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") pod \"5dff60f3-3acf-4dfb-9098-917736f61c0c\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.422274 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") pod \"5dff60f3-3acf-4dfb-9098-917736f61c0c\" (UID: \"5dff60f3-3acf-4dfb-9098-917736f61c0c\") " Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.423308 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle" (OuterVolumeSpecName: "bundle") pod "5dff60f3-3acf-4dfb-9098-917736f61c0c" (UID: "5dff60f3-3acf-4dfb-9098-917736f61c0c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.428605 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn" (OuterVolumeSpecName: "kube-api-access-4dtbn") pod "5dff60f3-3acf-4dfb-9098-917736f61c0c" (UID: "5dff60f3-3acf-4dfb-9098-917736f61c0c"). InnerVolumeSpecName "kube-api-access-4dtbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.533500 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.533570 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dtbn\" (UniqueName: \"kubernetes.io/projected/5dff60f3-3acf-4dfb-9098-917736f61c0c-kube-api-access-4dtbn\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.594515 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util" (OuterVolumeSpecName: "util") pod "5dff60f3-3acf-4dfb-9098-917736f61c0c" (UID: "5dff60f3-3acf-4dfb-9098-917736f61c0c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:12:18 crc kubenswrapper[4816]: I0311 12:12:18.634860 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff60f3-3acf-4dfb-9098-917736f61c0c-util\") on node \"crc\" DevicePath \"\"" Mar 11 12:12:19 crc kubenswrapper[4816]: I0311 12:12:19.070618 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" event={"ID":"5dff60f3-3acf-4dfb-9098-917736f61c0c","Type":"ContainerDied","Data":"f829741e1ad8b32117f692ad3039b9990bd12ea53fdc942a98b351dfc5dfabe3"} Mar 11 12:12:19 crc kubenswrapper[4816]: I0311 12:12:19.070676 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f829741e1ad8b32117f692ad3039b9990bd12ea53fdc942a98b351dfc5dfabe3" Mar 11 12:12:19 crc kubenswrapper[4816]: I0311 12:12:19.070738 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.585148 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5679b59769-8stwg"] Mar 11 12:12:27 crc kubenswrapper[4816]: E0311 12:12:27.585939 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="pull" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.585956 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="pull" Mar 11 12:12:27 crc kubenswrapper[4816]: E0311 12:12:27.585972 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.585980 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" Mar 11 12:12:27 crc kubenswrapper[4816]: E0311 12:12:27.585994 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="util" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586001 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="util" Mar 11 12:12:27 crc kubenswrapper[4816]: E0311 12:12:27.586023 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="extract" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586030 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="extract" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586133 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc988f7-8a1a-4d22-b6bb-b2617c721017" containerName="console" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586150 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dff60f3-3acf-4dfb-9098-917736f61c0c" containerName="extract" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.586635 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.589096 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zs7v5" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.589238 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.589499 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.589765 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.590053 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.642630 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5679b59769-8stwg"] Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.652836 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-apiservice-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.652930 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzjg\" (UniqueName: \"kubernetes.io/projected/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-kube-api-access-ztzjg\") pod 
\"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.652975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-webhook-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.754621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzjg\" (UniqueName: \"kubernetes.io/projected/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-kube-api-access-ztzjg\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.754990 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-webhook-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.755172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-apiservice-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc 
kubenswrapper[4816]: I0311 12:12:27.761369 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-webhook-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.761789 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-apiservice-cert\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.774574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzjg\" (UniqueName: \"kubernetes.io/projected/7f7c9c4d-3a3f-4524-8964-8a99f24c2786-kube-api-access-ztzjg\") pod \"metallb-operator-controller-manager-5679b59769-8stwg\" (UID: \"7f7c9c4d-3a3f-4524-8964-8a99f24c2786\") " pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.828775 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz"] Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.829491 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.833349 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wwfq8" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.833545 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.834094 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.848966 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz"] Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.856870 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-webhook-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.857227 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qfn8\" (UniqueName: \"kubernetes.io/projected/72342d10-d8c0-4f04-9554-e57c84d77653-kube-api-access-8qfn8\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.857372 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-apiservice-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.903121 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.959066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-webhook-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.959125 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qfn8\" (UniqueName: \"kubernetes.io/projected/72342d10-d8c0-4f04-9554-e57c84d77653-kube-api-access-8qfn8\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.959211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-apiservice-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.963701 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-webhook-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.963744 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72342d10-d8c0-4f04-9554-e57c84d77653-apiservice-cert\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:27 crc kubenswrapper[4816]: I0311 12:12:27.985241 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qfn8\" (UniqueName: \"kubernetes.io/projected/72342d10-d8c0-4f04-9554-e57c84d77653-kube-api-access-8qfn8\") pod \"metallb-operator-webhook-server-96bb59846-7z5mz\" (UID: \"72342d10-d8c0-4f04-9554-e57c84d77653\") " pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:28 crc kubenswrapper[4816]: I0311 12:12:28.147516 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:28 crc kubenswrapper[4816]: I0311 12:12:28.386731 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5679b59769-8stwg"] Mar 11 12:12:28 crc kubenswrapper[4816]: I0311 12:12:28.632976 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz"] Mar 11 12:12:28 crc kubenswrapper[4816]: W0311 12:12:28.643696 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72342d10_d8c0_4f04_9554_e57c84d77653.slice/crio-08441a46d180046d808092e32b22e396e5fce174c3ba4335a9fa5019f5267b82 WatchSource:0}: Error finding container 08441a46d180046d808092e32b22e396e5fce174c3ba4335a9fa5019f5267b82: Status 404 returned error can't find the container with id 08441a46d180046d808092e32b22e396e5fce174c3ba4335a9fa5019f5267b82 Mar 11 12:12:29 crc kubenswrapper[4816]: I0311 12:12:29.136519 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" event={"ID":"72342d10-d8c0-4f04-9554-e57c84d77653","Type":"ContainerStarted","Data":"08441a46d180046d808092e32b22e396e5fce174c3ba4335a9fa5019f5267b82"} Mar 11 12:12:29 crc kubenswrapper[4816]: I0311 12:12:29.137633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" event={"ID":"7f7c9c4d-3a3f-4524-8964-8a99f24c2786","Type":"ContainerStarted","Data":"a11cf9aaf1ecddcd77fb2049cfeb363445447d59990dab89091820bec520e515"} Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.194896 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" 
event={"ID":"72342d10-d8c0-4f04-9554-e57c84d77653","Type":"ContainerStarted","Data":"89891833c721f256f8d2cd36bd0263ece32c7b4d3a6c7b3f104aae41659ecc08"} Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.196426 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.199539 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" event={"ID":"7f7c9c4d-3a3f-4524-8964-8a99f24c2786","Type":"ContainerStarted","Data":"ae00879262f70582385d8c1328565352a267e8e8c14456ddb3b3a385b4befb00"} Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.200294 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.226357 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" podStartSLOduration=2.203498951 podStartE2EDuration="7.226338289s" podCreationTimestamp="2026-03-11 12:12:27 +0000 UTC" firstStartedPulling="2026-03-11 12:12:28.645729607 +0000 UTC m=+835.236993564" lastFinishedPulling="2026-03-11 12:12:33.668568935 +0000 UTC m=+840.259832902" observedRunningTime="2026-03-11 12:12:34.223661012 +0000 UTC m=+840.814924979" watchObservedRunningTime="2026-03-11 12:12:34.226338289 +0000 UTC m=+840.817602266" Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.261397 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" podStartSLOduration=2.00094873 podStartE2EDuration="7.261365815s" podCreationTimestamp="2026-03-11 12:12:27 +0000 UTC" firstStartedPulling="2026-03-11 12:12:28.406585995 +0000 UTC m=+834.997849962" lastFinishedPulling="2026-03-11 
12:12:33.66700307 +0000 UTC m=+840.258267047" observedRunningTime="2026-03-11 12:12:34.260706986 +0000 UTC m=+840.851970953" watchObservedRunningTime="2026-03-11 12:12:34.261365815 +0000 UTC m=+840.852629782" Mar 11 12:12:34 crc kubenswrapper[4816]: I0311 12:12:34.583239 4816 scope.go:117] "RemoveContainer" containerID="587884e89c7feed672cb66139bc979bafcbc72560d3687797e97ca922f238ebb" Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.515638 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.516627 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.516708 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.517602 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:12:39 crc kubenswrapper[4816]: I0311 12:12:39.517666 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3" gracePeriod=600 Mar 11 12:12:40 crc kubenswrapper[4816]: I0311 12:12:40.255268 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3" exitCode=0 Mar 11 12:12:40 crc kubenswrapper[4816]: I0311 12:12:40.255357 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3"} Mar 11 12:12:40 crc kubenswrapper[4816]: I0311 12:12:40.255588 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507"} Mar 11 12:12:40 crc kubenswrapper[4816]: I0311 12:12:40.255620 4816 scope.go:117] "RemoveContainer" containerID="233fffa5de6ee1e762a8824b32dec71fe3b7403332cc2d914d3770d768c1fbca" Mar 11 12:12:48 crc kubenswrapper[4816]: I0311 12:12:48.160633 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-96bb59846-7z5mz" Mar 11 12:13:07 crc kubenswrapper[4816]: I0311 12:13:07.906981 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5679b59769-8stwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.622625 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.623438 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.625259 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.625617 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q7hhh" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.633551 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bjfwg"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.636528 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.637382 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.639372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.639638 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672201 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672625 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-reloader\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") 
" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6512814f-09cf-4b97-a1d6-ec99bcbf1525-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672690 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-startup\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672719 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4b7\" (UniqueName: \"kubernetes.io/projected/6512814f-09cf-4b97-a1d6-ec99bcbf1525-kube-api-access-vt4b7\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672752 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-conf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672787 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djhgf\" (UniqueName: \"kubernetes.io/projected/00616041-f382-4b2a-a7ef-b75a14621ce1-kube-api-access-djhgf\") pod \"frr-k8s-bjfwg\" (UID: 
\"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672808 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.672826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-sockets\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.714783 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wqwrt"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.716042 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.718044 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lzmwq" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.718279 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.718611 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.719355 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.751420 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-srnjf"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.752367 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.755260 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.765556 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-srnjf"] Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-conf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djhgf\" (UniqueName: \"kubernetes.io/projected/00616041-f382-4b2a-a7ef-b75a14621ce1-kube-api-access-djhgf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773929 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773946 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-cert\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773973 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.773991 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-sockets\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnj7\" (UniqueName: \"kubernetes.io/projected/2af0656a-169d-42fe-8efb-5258bc56af56-kube-api-access-llnj7\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774045 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774064 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-reloader\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774091 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774113 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6512814f-09cf-4b97-a1d6-ec99bcbf1525-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774131 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-startup\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774152 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metallb-excludel2\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774174 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4b7\" (UniqueName: \"kubernetes.io/projected/6512814f-09cf-4b97-a1d6-ec99bcbf1525-kube-api-access-vt4b7\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774195 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwcv9\" (UniqueName: \"kubernetes.io/projected/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-kube-api-access-cwcv9\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.774681 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-conf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.775129 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.775343 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-sockets\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.775426 4816 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.775472 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs podName:00616041-f382-4b2a-a7ef-b75a14621ce1 nodeName:}" failed. No retries permitted until 2026-03-11 12:13:09.275455058 +0000 UTC m=+875.866719025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs") pod "frr-k8s-bjfwg" (UID: "00616041-f382-4b2a-a7ef-b75a14621ce1") : secret "frr-k8s-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.775650 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/00616041-f382-4b2a-a7ef-b75a14621ce1-reloader\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.782499 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/00616041-f382-4b2a-a7ef-b75a14621ce1-frr-startup\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.797168 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6512814f-09cf-4b97-a1d6-ec99bcbf1525-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.800415 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4b7\" (UniqueName: \"kubernetes.io/projected/6512814f-09cf-4b97-a1d6-ec99bcbf1525-kube-api-access-vt4b7\") pod \"frr-k8s-webhook-server-bcc4b6f68-h8scg\" (UID: \"6512814f-09cf-4b97-a1d6-ec99bcbf1525\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.804008 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djhgf\" (UniqueName: \"kubernetes.io/projected/00616041-f382-4b2a-a7ef-b75a14621ce1-kube-api-access-djhgf\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875547 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875594 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-cert\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875630 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llnj7\" (UniqueName: \"kubernetes.io/projected/2af0656a-169d-42fe-8efb-5258bc56af56-kube-api-access-llnj7\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875719 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.875752 4816 secret.go:188] 
Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.875840 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs podName:43ec0f0d-8425-4dc4-9aa2-f1f85a26548c nodeName:}" failed. No retries permitted until 2026-03-11 12:13:09.375817551 +0000 UTC m=+875.967081518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs") pod "speaker-wqwrt" (UID: "43ec0f0d-8425-4dc4-9aa2-f1f85a26548c") : secret "speaker-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.875995 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metallb-excludel2\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.876029 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwcv9\" (UniqueName: \"kubernetes.io/projected/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-kube-api-access-cwcv9\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.876046 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.876060 4816 secret.go:188] Couldn't get secret 
metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.876182 4816 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.876188 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist podName:43ec0f0d-8425-4dc4-9aa2-f1f85a26548c nodeName:}" failed. No retries permitted until 2026-03-11 12:13:09.376154061 +0000 UTC m=+875.967418028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist") pod "speaker-wqwrt" (UID: "43ec0f0d-8425-4dc4-9aa2-f1f85a26548c") : secret "metallb-memberlist" not found Mar 11 12:13:08 crc kubenswrapper[4816]: E0311 12:13:08.876292 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs podName:2af0656a-169d-42fe-8efb-5258bc56af56 nodeName:}" failed. No retries permitted until 2026-03-11 12:13:09.376276604 +0000 UTC m=+875.967540571 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs") pod "controller-7bb4cc7c98-srnjf" (UID: "2af0656a-169d-42fe-8efb-5258bc56af56") : secret "controller-certs-secret" not found Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.876895 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metallb-excludel2\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.878554 4816 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.890685 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-cert\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.896676 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwcv9\" (UniqueName: \"kubernetes.io/projected/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-kube-api-access-cwcv9\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.908918 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnj7\" (UniqueName: \"kubernetes.io/projected/2af0656a-169d-42fe-8efb-5258bc56af56-kube-api-access-llnj7\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:08 crc kubenswrapper[4816]: I0311 12:13:08.959384 
4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.221654 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg"] Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.283737 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.288631 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00616041-f382-4b2a-a7ef-b75a14621ce1-metrics-certs\") pod \"frr-k8s-bjfwg\" (UID: \"00616041-f382-4b2a-a7ef-b75a14621ce1\") " pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.384906 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.384983 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.385011 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs\") pod 
\"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:09 crc kubenswrapper[4816]: E0311 12:13:09.385580 4816 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 11 12:13:09 crc kubenswrapper[4816]: E0311 12:13:09.385694 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist podName:43ec0f0d-8425-4dc4-9aa2-f1f85a26548c nodeName:}" failed. No retries permitted until 2026-03-11 12:13:10.385671141 +0000 UTC m=+876.976935118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist") pod "speaker-wqwrt" (UID: "43ec0f0d-8425-4dc4-9aa2-f1f85a26548c") : secret "metallb-memberlist" not found Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.388808 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2af0656a-169d-42fe-8efb-5258bc56af56-metrics-certs\") pod \"controller-7bb4cc7c98-srnjf\" (UID: \"2af0656a-169d-42fe-8efb-5258bc56af56\") " pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.389145 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-metrics-certs\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.462746 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" event={"ID":"6512814f-09cf-4b97-a1d6-ec99bcbf1525","Type":"ContainerStarted","Data":"17f73d8496f7bee3e2dcc587cb93e188b62354803564e1f3f60b89434f9e3441"} Mar 11 
12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.568944 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bjfwg" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.667400 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:09 crc kubenswrapper[4816]: I0311 12:13:09.937743 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-srnjf"] Mar 11 12:13:09 crc kubenswrapper[4816]: W0311 12:13:09.950630 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af0656a_169d_42fe_8efb_5258bc56af56.slice/crio-0afe1870f3777aa43016754bf2643fbfc2fb505de918741892f71c54c899edaf WatchSource:0}: Error finding container 0afe1870f3777aa43016754bf2643fbfc2fb505de918741892f71c54c899edaf: Status 404 returned error can't find the container with id 0afe1870f3777aa43016754bf2643fbfc2fb505de918741892f71c54c899edaf Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.405757 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.425675 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/43ec0f0d-8425-4dc4-9aa2-f1f85a26548c-memberlist\") pod \"speaker-wqwrt\" (UID: \"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c\") " pod="metallb-system/speaker-wqwrt" Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.472336 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-srnjf" 
event={"ID":"2af0656a-169d-42fe-8efb-5258bc56af56","Type":"ContainerStarted","Data":"26f61a4f99d2a72d14730c0d17560bfffb50a7eea0e31c5b4dea48188c53e6fb"} Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.472382 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-srnjf" event={"ID":"2af0656a-169d-42fe-8efb-5258bc56af56","Type":"ContainerStarted","Data":"656ef519ad1181d162b9fe8ea90de48e7d48355099403e9d619b445009dd139c"} Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.472393 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-srnjf" event={"ID":"2af0656a-169d-42fe-8efb-5258bc56af56","Type":"ContainerStarted","Data":"0afe1870f3777aa43016754bf2643fbfc2fb505de918741892f71c54c899edaf"} Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.473129 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-srnjf" Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.474260 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"5a9dba4926deba2e8b891f2db18e7ce4c68880e9a96831827ae9c38bbe5f2c82"} Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.490237 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-srnjf" podStartSLOduration=2.490210548 podStartE2EDuration="2.490210548s" podCreationTimestamp="2026-03-11 12:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:13:10.486280293 +0000 UTC m=+877.077544270" watchObservedRunningTime="2026-03-11 12:13:10.490210548 +0000 UTC m=+877.081474525" Mar 11 12:13:10 crc kubenswrapper[4816]: I0311 12:13:10.530238 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wqwrt" Mar 11 12:13:10 crc kubenswrapper[4816]: W0311 12:13:10.550384 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ec0f0d_8425_4dc4_9aa2_f1f85a26548c.slice/crio-33eb6899821d126ac4e3419b0387e50f85c3ebaca59cfbc9c9f2233bc4cc33f1 WatchSource:0}: Error finding container 33eb6899821d126ac4e3419b0387e50f85c3ebaca59cfbc9c9f2233bc4cc33f1: Status 404 returned error can't find the container with id 33eb6899821d126ac4e3419b0387e50f85c3ebaca59cfbc9c9f2233bc4cc33f1 Mar 11 12:13:11 crc kubenswrapper[4816]: I0311 12:13:11.484997 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wqwrt" event={"ID":"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c","Type":"ContainerStarted","Data":"56aeba0abf7b5e7e1d66f34e132c4143f9384145ef2cab676af43e201f2dc56d"} Mar 11 12:13:11 crc kubenswrapper[4816]: I0311 12:13:11.485282 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wqwrt" event={"ID":"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c","Type":"ContainerStarted","Data":"f0c9fce879c4599b760a54e8ac88a1933ff3d5cfce1c9f9e04addf1fbefaa8b9"} Mar 11 12:13:11 crc kubenswrapper[4816]: I0311 12:13:11.485292 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wqwrt" event={"ID":"43ec0f0d-8425-4dc4-9aa2-f1f85a26548c","Type":"ContainerStarted","Data":"33eb6899821d126ac4e3419b0387e50f85c3ebaca59cfbc9c9f2233bc4cc33f1"} Mar 11 12:13:11 crc kubenswrapper[4816]: I0311 12:13:11.485376 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wqwrt" Mar 11 12:13:14 crc kubenswrapper[4816]: I0311 12:13:14.154648 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wqwrt" podStartSLOduration=6.154629544 podStartE2EDuration="6.154629544s" podCreationTimestamp="2026-03-11 12:13:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:13:11.507910939 +0000 UTC m=+878.099174906" watchObservedRunningTime="2026-03-11 12:13:14.154629544 +0000 UTC m=+880.745893531"
Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.568467 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" event={"ID":"6512814f-09cf-4b97-a1d6-ec99bcbf1525","Type":"ContainerStarted","Data":"99c19e17cdcff645fffab67a9ef4bb0de3f6f0cf791dc49b158ed9a3fdf7dd6f"}
Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.569220 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg"
Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.570728 4816 generic.go:334] "Generic (PLEG): container finished" podID="00616041-f382-4b2a-a7ef-b75a14621ce1" containerID="80d63f9330ab5cde4f6b1a4cc8204f25a2e19e4772b7b59ea3c468ba8092dd4c" exitCode=0
Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.570785 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerDied","Data":"80d63f9330ab5cde4f6b1a4cc8204f25a2e19e4772b7b59ea3c468ba8092dd4c"}
Mar 11 12:13:18 crc kubenswrapper[4816]: I0311 12:13:18.616701 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg" podStartSLOduration=2.213666838 podStartE2EDuration="10.616683049s" podCreationTimestamp="2026-03-11 12:13:08 +0000 UTC" firstStartedPulling="2026-03-11 12:13:09.230551228 +0000 UTC m=+875.821815195" lastFinishedPulling="2026-03-11 12:13:17.633567419 +0000 UTC m=+884.224831406" observedRunningTime="2026-03-11 12:13:18.591237695 +0000 UTC m=+885.182501662" watchObservedRunningTime="2026-03-11 12:13:18.616683049 +0000 UTC m=+885.207947016"
Mar 11 12:13:19 crc kubenswrapper[4816]: I0311 12:13:19.580336 4816 generic.go:334] "Generic (PLEG): container finished" podID="00616041-f382-4b2a-a7ef-b75a14621ce1" containerID="af43bf7bbfdb9660cc95118ff47395fdf22dc547009038f832f159d9c187fa86" exitCode=0
Mar 11 12:13:19 crc kubenswrapper[4816]: I0311 12:13:19.580436 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerDied","Data":"af43bf7bbfdb9660cc95118ff47395fdf22dc547009038f832f159d9c187fa86"}
Mar 11 12:13:20 crc kubenswrapper[4816]: I0311 12:13:20.534986 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wqwrt"
Mar 11 12:13:20 crc kubenswrapper[4816]: I0311 12:13:20.589854 4816 generic.go:334] "Generic (PLEG): container finished" podID="00616041-f382-4b2a-a7ef-b75a14621ce1" containerID="3f07e740a2d2dea998fafa7f845ed22af997ceb1958d238a27da269158256704" exitCode=0
Mar 11 12:13:20 crc kubenswrapper[4816]: I0311 12:13:20.589911 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerDied","Data":"3f07e740a2d2dea998fafa7f845ed22af997ceb1958d238a27da269158256704"}
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.600766 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"fd538c5d6fee4cdac18f05446f745df698fa9cfbd7711c66de0900f134715ab0"}
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601099 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bjfwg"
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601112 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"b31aaec34f323c4075e4c70efe09df3a55e9f1a929b02ae47f26f89ad0ae1288"}
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601150 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"6e3f03dce05c9643d39eaa480e95f0e4f42021822a1ae2d8ba30928026a1c9f2"}
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601160 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"3966ade3f260067a8fd8341fb5bc7c77d0da7251038b72a02ba38dd87423d2f7"}
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601171 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"ce18729fc1f9a1876ee7d2356e9474d2b58fa559af4903db1d556fd615d2c70c"}
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.601179 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjfwg" event={"ID":"00616041-f382-4b2a-a7ef-b75a14621ce1","Type":"ContainerStarted","Data":"f849a45ec28d27f080b810fcc5a49d7f4ccfb3711e1dca572cf9b3908c3ad3e7"}
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.620702 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bjfwg" podStartSLOduration=5.667356175 podStartE2EDuration="13.620682355s" podCreationTimestamp="2026-03-11 12:13:08 +0000 UTC" firstStartedPulling="2026-03-11 12:13:09.697950237 +0000 UTC m=+876.289214234" lastFinishedPulling="2026-03-11 12:13:17.651276447 +0000 UTC m=+884.242540414" observedRunningTime="2026-03-11 12:13:21.618748039 +0000 UTC m=+888.210012006" watchObservedRunningTime="2026-03-11 12:13:21.620682355 +0000 UTC m=+888.211946322"
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.969422 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"]
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.970837 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.975548 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 11 12:13:21 crc kubenswrapper[4816]: I0311 12:13:21.989413 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"]
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.109475 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.109604 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.109718 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211561 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211937 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.211989 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.231287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.293831 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:22 crc kubenswrapper[4816]: I0311 12:13:22.774549 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"]
Mar 11 12:13:22 crc kubenswrapper[4816]: W0311 12:13:22.784382 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c6bbc7_62af_4c3a_ac05_1897b9f00080.slice/crio-5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd WatchSource:0}: Error finding container 5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd: Status 404 returned error can't find the container with id 5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd
Mar 11 12:13:23 crc kubenswrapper[4816]: I0311 12:13:23.616869 4816 generic.go:334] "Generic (PLEG): container finished" podID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerID="5700c7771a4c284d0e3dc33cb363063ce713c4c0827d2527d184d380ea4d0beb" exitCode=0
Mar 11 12:13:23 crc kubenswrapper[4816]: I0311 12:13:23.617019 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerDied","Data":"5700c7771a4c284d0e3dc33cb363063ce713c4c0827d2527d184d380ea4d0beb"}
Mar 11 12:13:23 crc kubenswrapper[4816]: I0311 12:13:23.617297 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerStarted","Data":"5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd"}
Mar 11 12:13:24 crc kubenswrapper[4816]: I0311 12:13:24.569190 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bjfwg"
Mar 11 12:13:24 crc kubenswrapper[4816]: I0311 12:13:24.639824 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bjfwg"
Mar 11 12:13:27 crc kubenswrapper[4816]: I0311 12:13:27.672716 4816 generic.go:334] "Generic (PLEG): container finished" podID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerID="c309b14826419376450e34ef7e71c711ef13f1aa8e7a532324a580d7febe6e32" exitCode=0
Mar 11 12:13:27 crc kubenswrapper[4816]: I0311 12:13:27.672851 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerDied","Data":"c309b14826419376450e34ef7e71c711ef13f1aa8e7a532324a580d7febe6e32"}
Mar 11 12:13:28 crc kubenswrapper[4816]: I0311 12:13:28.684951 4816 generic.go:334] "Generic (PLEG): container finished" podID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerID="5f6bb57f799d21785887f4df7ec76a35a9787bef2a1f8eece1bb78c306d6fc80" exitCode=0
Mar 11 12:13:28 crc kubenswrapper[4816]: I0311 12:13:28.685083 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerDied","Data":"5f6bb57f799d21785887f4df7ec76a35a9787bef2a1f8eece1bb78c306d6fc80"}
Mar 11 12:13:28 crc kubenswrapper[4816]: I0311 12:13:28.966352 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-h8scg"
Mar 11 12:13:29 crc kubenswrapper[4816]: I0311 12:13:29.673170 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-srnjf"
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.038268 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.137877 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") pod \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") "
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.138002 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") pod \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") "
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.138091 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") pod \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\" (UID: \"f9c6bbc7-62af-4c3a-ac05-1897b9f00080\") "
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.138769 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle" (OuterVolumeSpecName: "bundle") pod "f9c6bbc7-62af-4c3a-ac05-1897b9f00080" (UID: "f9c6bbc7-62af-4c3a-ac05-1897b9f00080"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.147548 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util" (OuterVolumeSpecName: "util") pod "f9c6bbc7-62af-4c3a-ac05-1897b9f00080" (UID: "f9c6bbc7-62af-4c3a-ac05-1897b9f00080"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.153864 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr" (OuterVolumeSpecName: "kube-api-access-56ksr") pod "f9c6bbc7-62af-4c3a-ac05-1897b9f00080" (UID: "f9c6bbc7-62af-4c3a-ac05-1897b9f00080"). InnerVolumeSpecName "kube-api-access-56ksr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.240620 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-util\") on node \"crc\" DevicePath \"\""
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.240834 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.240897 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56ksr\" (UniqueName: \"kubernetes.io/projected/f9c6bbc7-62af-4c3a-ac05-1897b9f00080-kube-api-access-56ksr\") on node \"crc\" DevicePath \"\""
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.705765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2" event={"ID":"f9c6bbc7-62af-4c3a-ac05-1897b9f00080","Type":"ContainerDied","Data":"5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd"}
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.705824 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a50909b238048d7291670196a1a9dcbfff784f61d6b5145b42b735b88bf62cd"
Mar 11 12:13:30 crc kubenswrapper[4816]: I0311 12:13:30.705875 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.101573 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"]
Mar 11 12:13:35 crc kubenswrapper[4816]: E0311 12:13:35.102720 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="util"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.102755 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="util"
Mar 11 12:13:35 crc kubenswrapper[4816]: E0311 12:13:35.102799 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="extract"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.102818 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="extract"
Mar 11 12:13:35 crc kubenswrapper[4816]: E0311 12:13:35.102841 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="pull"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.102855 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="pull"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.103071 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c6bbc7-62af-4c3a-ac05-1897b9f00080" containerName="extract"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.104018 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.105870 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.106059 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-8czc4"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.108037 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.139847 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"]
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.217302 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/950f7daa-b6eb-488f-877d-774c73576ed0-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.217513 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xvq2\" (UniqueName: \"kubernetes.io/projected/950f7daa-b6eb-488f-877d-774c73576ed0-kube-api-access-2xvq2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.319120 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/950f7daa-b6eb-488f-877d-774c73576ed0-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.319221 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xvq2\" (UniqueName: \"kubernetes.io/projected/950f7daa-b6eb-488f-877d-774c73576ed0-kube-api-access-2xvq2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.320156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/950f7daa-b6eb-488f-877d-774c73576ed0-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.344733 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xvq2\" (UniqueName: \"kubernetes.io/projected/950f7daa-b6eb-488f-877d-774c73576ed0-kube-api-access-2xvq2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-bpwb5\" (UID: \"950f7daa-b6eb-488f-877d-774c73576ed0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.449606 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"
Mar 11 12:13:35 crc kubenswrapper[4816]: I0311 12:13:35.769951 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5"]
Mar 11 12:13:36 crc kubenswrapper[4816]: I0311 12:13:36.743653 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" event={"ID":"950f7daa-b6eb-488f-877d-774c73576ed0","Type":"ContainerStarted","Data":"c21eec66beaf6443bc73563e51c1f973bf98c031492c7dbd502a2d0bcca29bcf"}
Mar 11 12:13:39 crc kubenswrapper[4816]: I0311 12:13:39.574718 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bjfwg"
Mar 11 12:13:39 crc kubenswrapper[4816]: I0311 12:13:39.769894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" event={"ID":"950f7daa-b6eb-488f-877d-774c73576ed0","Type":"ContainerStarted","Data":"896fe8d71362e8ef9f1d3c398422eedc31ba0fe47a90af6c276e204e8ec911fb"}
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.148751 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-bpwb5" podStartSLOduration=5.484942669 podStartE2EDuration="9.148728416s" podCreationTimestamp="2026-03-11 12:13:35 +0000 UTC" firstStartedPulling="2026-03-11 12:13:35.783388053 +0000 UTC m=+902.374652030" lastFinishedPulling="2026-03-11 12:13:39.44717381 +0000 UTC m=+906.038437777" observedRunningTime="2026-03-11 12:13:39.827887876 +0000 UTC m=+906.419151843" watchObservedRunningTime="2026-03-11 12:13:44.148728416 +0000 UTC m=+910.739992383"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.151587 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2jk7k"]
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.153193 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.157152 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-n6ssm"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.157161 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.157622 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.166997 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2jk7k"]
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.260402 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n7vv\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-kube-api-access-7n7vv\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.260487 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.362338 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.362454 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7vv\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-kube-api-access-7n7vv\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.381392 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n7vv\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-kube-api-access-7n7vv\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.384364 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a41e6b9-3b80-4eed-a8db-65aa010f449d-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2jk7k\" (UID: \"0a41e6b9-3b80-4eed-a8db-65aa010f449d\") " pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.474569 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k"
Mar 11 12:13:44 crc kubenswrapper[4816]: I0311 12:13:44.929641 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2jk7k"]
Mar 11 12:13:45 crc kubenswrapper[4816]: I0311 12:13:45.814005 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" event={"ID":"0a41e6b9-3b80-4eed-a8db-65aa010f449d","Type":"ContainerStarted","Data":"76f7aff87bcdf58aca3226d7bd33c4c416fb8440f06999d8f9216cbb0e9fe1b5"}
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.387802 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z6224"]
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.389346 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.408390 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6224"]
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.524263 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.524298 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.524323 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.625699 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.625744 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.625774 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.626391 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.626565 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.649945 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") pod \"community-operators-z6224\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:48 crc kubenswrapper[4816]: I0311 12:13:48.710862 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6224"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.256913 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fgzw7"]
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.257830 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.261575 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-z54x9"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.268169 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fgzw7"]
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.353143 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.353200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szzdz\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-kube-api-access-szzdz\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.454110 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.454170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szzdz\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-kube-api-access-szzdz\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.472954 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.485439 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szzdz\" (UniqueName: \"kubernetes.io/projected/f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8-kube-api-access-szzdz\") pod \"cert-manager-cainjector-5545bd876-fgzw7\" (UID: \"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7"
Mar 11 12:13:49 crc kubenswrapper[4816]: I0311 12:13:49.578595 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7"
Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.795404 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"]
Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.797722 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sn8n6"
Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.815977 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"]
Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.885187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6"
Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.885336 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6"
Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.885400 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6"
Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.987504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6"
Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.987613 4816 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.987651 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.989649 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.990193 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:50 crc kubenswrapper[4816]: I0311 12:13:50.991614 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:13:51 crc kubenswrapper[4816]: W0311 12:13:51.003449 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb816e33_9a16_4326_8058_c328fabcab45.slice/crio-7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be WatchSource:0}: Error finding container 
7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be: Status 404 returned error can't find the container with id 7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.044872 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") pod \"certified-operators-sn8n6\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.124831 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fgzw7"] Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.170290 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.431864 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:13:51 crc kubenswrapper[4816]: W0311 12:13:51.437182 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0a6696_3d0e_4173_9c1c_260bfdd757fb.slice/crio-1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c WatchSource:0}: Error finding container 1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c: Status 404 returned error can't find the container with id 1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.890306 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerID="39208e7933d3e745276296861bf7019ae82dfe513403abeb752230d2b7d5f71e" exitCode=0 Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.890399 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerDied","Data":"39208e7933d3e745276296861bf7019ae82dfe513403abeb752230d2b7d5f71e"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.890449 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerStarted","Data":"1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.893697 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" event={"ID":"0a41e6b9-3b80-4eed-a8db-65aa010f449d","Type":"ContainerStarted","Data":"977d30cdf14676945adae36064bcac3a2355a3f11d66a1ec67fe0d29af62ae53"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.894097 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.896672 4816 generic.go:334] "Generic (PLEG): container finished" podID="bb816e33-9a16-4326-8058-c328fabcab45" containerID="b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2" exitCode=0 Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.896725 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerDied","Data":"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.896750 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerStarted","Data":"7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be"} Mar 
11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.898400 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" event={"ID":"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8","Type":"ContainerStarted","Data":"1d11c941cff6d506aff5ffb7382bcdc6782f901e85211c15cefc00dcdfda4296"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.898431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" event={"ID":"f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8","Type":"ContainerStarted","Data":"1b46c040faba5a5471994d87bab1a61c54b7186223efa24040c1e0f27e8be2b0"} Mar 11 12:13:51 crc kubenswrapper[4816]: I0311 12:13:51.950671 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" podStartSLOduration=2.080280661 podStartE2EDuration="7.950650471s" podCreationTimestamp="2026-03-11 12:13:44 +0000 UTC" firstStartedPulling="2026-03-11 12:13:44.937522773 +0000 UTC m=+911.528786730" lastFinishedPulling="2026-03-11 12:13:50.807892583 +0000 UTC m=+917.399156540" observedRunningTime="2026-03-11 12:13:51.94553552 +0000 UTC m=+918.536799487" watchObservedRunningTime="2026-03-11 12:13:51.950650471 +0000 UTC m=+918.541914428" Mar 11 12:13:52 crc kubenswrapper[4816]: I0311 12:13:52.007297 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-fgzw7" podStartSLOduration=3.007276316 podStartE2EDuration="3.007276316s" podCreationTimestamp="2026-03-11 12:13:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:13:51.999005953 +0000 UTC m=+918.590269920" watchObservedRunningTime="2026-03-11 12:13:52.007276316 +0000 UTC m=+918.598540283" Mar 11 12:13:52 crc kubenswrapper[4816]: I0311 12:13:52.905028 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerStarted","Data":"60d6fa46e4d2deba8d1be2490841b0784c3be0ee06e9d85e187cb81fe38f5f2c"} Mar 11 12:13:52 crc kubenswrapper[4816]: I0311 12:13:52.907272 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerStarted","Data":"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89"} Mar 11 12:13:53 crc kubenswrapper[4816]: I0311 12:13:53.914830 4816 generic.go:334] "Generic (PLEG): container finished" podID="bb816e33-9a16-4326-8058-c328fabcab45" containerID="50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89" exitCode=0 Mar 11 12:13:53 crc kubenswrapper[4816]: I0311 12:13:53.914910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerDied","Data":"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89"} Mar 11 12:13:53 crc kubenswrapper[4816]: I0311 12:13:53.916938 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerID="60d6fa46e4d2deba8d1be2490841b0784c3be0ee06e9d85e187cb81fe38f5f2c" exitCode=0 Mar 11 12:13:53 crc kubenswrapper[4816]: I0311 12:13:53.916986 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerDied","Data":"60d6fa46e4d2deba8d1be2490841b0784c3be0ee06e9d85e187cb81fe38f5f2c"} Mar 11 12:13:54 crc kubenswrapper[4816]: I0311 12:13:54.927229 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" 
event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerStarted","Data":"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf"} Mar 11 12:13:54 crc kubenswrapper[4816]: I0311 12:13:54.932000 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerStarted","Data":"81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df"} Mar 11 12:13:54 crc kubenswrapper[4816]: I0311 12:13:54.951999 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z6224" podStartSLOduration=4.370797718 podStartE2EDuration="6.951982556s" podCreationTimestamp="2026-03-11 12:13:48 +0000 UTC" firstStartedPulling="2026-03-11 12:13:51.898342673 +0000 UTC m=+918.489606640" lastFinishedPulling="2026-03-11 12:13:54.479527511 +0000 UTC m=+921.070791478" observedRunningTime="2026-03-11 12:13:54.949939256 +0000 UTC m=+921.541203223" watchObservedRunningTime="2026-03-11 12:13:54.951982556 +0000 UTC m=+921.543246523" Mar 11 12:13:54 crc kubenswrapper[4816]: I0311 12:13:54.969805 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sn8n6" podStartSLOduration=2.408874757 podStartE2EDuration="4.969782869s" podCreationTimestamp="2026-03-11 12:13:50 +0000 UTC" firstStartedPulling="2026-03-11 12:13:51.89247009 +0000 UTC m=+918.483734067" lastFinishedPulling="2026-03-11 12:13:54.453378212 +0000 UTC m=+921.044642179" observedRunningTime="2026-03-11 12:13:54.966652227 +0000 UTC m=+921.557916194" watchObservedRunningTime="2026-03-11 12:13:54.969782869 +0000 UTC m=+921.561046836" Mar 11 12:13:58 crc kubenswrapper[4816]: I0311 12:13:58.711975 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:58 crc kubenswrapper[4816]: I0311 12:13:58.712856 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:58 crc kubenswrapper[4816]: I0311 12:13:58.762538 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:13:59 crc kubenswrapper[4816]: I0311 12:13:59.477349 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-2jk7k" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.143155 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.146682 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.146897 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.155642 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.155882 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.156080 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.233130 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") pod \"auto-csr-approver-29553854-hbf96\" (UID: \"af8a107b-6295-42d4-b64b-7841171f67f3\") " 
pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.335548 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") pod \"auto-csr-approver-29553854-hbf96\" (UID: \"af8a107b-6295-42d4-b64b-7841171f67f3\") " pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.354965 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") pod \"auto-csr-approver-29553854-hbf96\" (UID: \"af8a107b-6295-42d4-b64b-7841171f67f3\") " pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.483487 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.758919 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:14:00 crc kubenswrapper[4816]: I0311 12:14:00.981060 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553854-hbf96" event={"ID":"af8a107b-6295-42d4-b64b-7841171f67f3","Type":"ContainerStarted","Data":"dfea93cb7aee576a6b4033dd21e2e2174a0e4a253adbe8914d90cdec35819e27"} Mar 11 12:14:01 crc kubenswrapper[4816]: I0311 12:14:01.171415 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:01 crc kubenswrapper[4816]: I0311 12:14:01.171471 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:01 crc kubenswrapper[4816]: I0311 
12:14:01.222658 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.040802 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.089462 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.567789 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-62cp5"] Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.571062 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.577580 4816 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vh6t5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.594388 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-62cp5"] Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.668716 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-bound-sa-token\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.669447 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szk84\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-kube-api-access-szk84\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " 
pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.771541 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-bound-sa-token\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.771675 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szk84\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-kube-api-access-szk84\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.801634 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szk84\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-kube-api-access-szk84\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.803871 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e50b3f6b-4679-4337-a9cf-478aa2fb5800-bound-sa-token\") pod \"cert-manager-545d4d4674-62cp5\" (UID: \"e50b3f6b-4679-4337-a9cf-478aa2fb5800\") " pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:02 crc kubenswrapper[4816]: I0311 12:14:02.895693 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-62cp5" Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.210311 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-62cp5"] Mar 11 12:14:03 crc kubenswrapper[4816]: W0311 12:14:03.213630 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode50b3f6b_4679_4337_a9cf_478aa2fb5800.slice/crio-8f6b1cee2306078cb8905b6b956094561ff8d4a532879cffb9364772e92333cd WatchSource:0}: Error finding container 8f6b1cee2306078cb8905b6b956094561ff8d4a532879cffb9364772e92333cd: Status 404 returned error can't find the container with id 8f6b1cee2306078cb8905b6b956094561ff8d4a532879cffb9364772e92333cd Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.866469 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.868514 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.880017 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.992730 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.993155 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:03 crc kubenswrapper[4816]: I0311 12:14:03.993394 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.005912 4816 generic.go:334] "Generic (PLEG): container finished" podID="af8a107b-6295-42d4-b64b-7841171f67f3" containerID="37568547e2b255f52263c2130857ff28c18773cdb28a0d8fb13178ff2dc5ab7f" exitCode=0 Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.006015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553854-hbf96" 
event={"ID":"af8a107b-6295-42d4-b64b-7841171f67f3","Type":"ContainerDied","Data":"37568547e2b255f52263c2130857ff28c18773cdb28a0d8fb13178ff2dc5ab7f"} Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.007582 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sn8n6" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="registry-server" containerID="cri-o://81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df" gracePeriod=2 Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.008773 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-62cp5" event={"ID":"e50b3f6b-4679-4337-a9cf-478aa2fb5800","Type":"ContainerStarted","Data":"fd2724e590cc7a2a85d7400f164e79ed204e7298dae962307ec0591a74334974"} Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.008799 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-62cp5" event={"ID":"e50b3f6b-4679-4337-a9cf-478aa2fb5800","Type":"ContainerStarted","Data":"8f6b1cee2306078cb8905b6b956094561ff8d4a532879cffb9364772e92333cd"} Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.078695 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-62cp5" podStartSLOduration=2.07866855 podStartE2EDuration="2.07866855s" podCreationTimestamp="2026-03-11 12:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:14:04.073525119 +0000 UTC m=+930.664789076" watchObservedRunningTime="2026-03-11 12:14:04.07866855 +0000 UTC m=+930.669932517" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.095179 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") pod \"redhat-marketplace-hl4tp\" 
(UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.095238 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.095315 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.095342 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.103670 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") pod \"redhat-marketplace-hl4tp\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.129417 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") pod \"redhat-marketplace-hl4tp\" (UID: 
\"60a4785b-2b65-4a82-984d-611750e6e161\") " pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.201967 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:04 crc kubenswrapper[4816]: I0311 12:14:04.510711 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:04 crc kubenswrapper[4816]: W0311 12:14:04.524027 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60a4785b_2b65_4a82_984d_611750e6e161.slice/crio-f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d WatchSource:0}: Error finding container f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d: Status 404 returned error can't find the container with id f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.015276 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerStarted","Data":"f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d"} Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.323848 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.415754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") pod \"af8a107b-6295-42d4-b64b-7841171f67f3\" (UID: \"af8a107b-6295-42d4-b64b-7841171f67f3\") " Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.425488 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4" (OuterVolumeSpecName: "kube-api-access-5b7s4") pod "af8a107b-6295-42d4-b64b-7841171f67f3" (UID: "af8a107b-6295-42d4-b64b-7841171f67f3"). InnerVolumeSpecName "kube-api-access-5b7s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:05 crc kubenswrapper[4816]: I0311 12:14:05.517989 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b7s4\" (UniqueName: \"kubernetes.io/projected/af8a107b-6295-42d4-b64b-7841171f67f3-kube-api-access-5b7s4\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.026450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553854-hbf96" event={"ID":"af8a107b-6295-42d4-b64b-7841171f67f3","Type":"ContainerDied","Data":"dfea93cb7aee576a6b4033dd21e2e2174a0e4a253adbe8914d90cdec35819e27"} Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.026490 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553854-hbf96" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.026510 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfea93cb7aee576a6b4033dd21e2e2174a0e4a253adbe8914d90cdec35819e27" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.043567 4816 generic.go:334] "Generic (PLEG): container finished" podID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerID="81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df" exitCode=0 Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.043795 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerDied","Data":"81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df"} Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.050903 4816 generic.go:334] "Generic (PLEG): container finished" podID="60a4785b-2b65-4a82-984d-611750e6e161" containerID="39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3" exitCode=0 Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.050971 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerDied","Data":"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3"} Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.236364 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.335580 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") pod \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.335810 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") pod \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.335939 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") pod \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\" (UID: \"ff0a6696-3d0e-4173-9c1c-260bfdd757fb\") " Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.336802 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities" (OuterVolumeSpecName: "utilities") pod "ff0a6696-3d0e-4173-9c1c-260bfdd757fb" (UID: "ff0a6696-3d0e-4173-9c1c-260bfdd757fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.354569 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct" (OuterVolumeSpecName: "kube-api-access-zdjct") pod "ff0a6696-3d0e-4173-9c1c-260bfdd757fb" (UID: "ff0a6696-3d0e-4173-9c1c-260bfdd757fb"). InnerVolumeSpecName "kube-api-access-zdjct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.411292 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff0a6696-3d0e-4173-9c1c-260bfdd757fb" (UID: "ff0a6696-3d0e-4173-9c1c-260bfdd757fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.426464 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.435768 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553848-blbgg"] Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.438227 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.438296 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdjct\" (UniqueName: \"kubernetes.io/projected/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-kube-api-access-zdjct\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:06 crc kubenswrapper[4816]: I0311 12:14:06.438312 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0a6696-3d0e-4173-9c1c-260bfdd757fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.059301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn8n6" event={"ID":"ff0a6696-3d0e-4173-9c1c-260bfdd757fb","Type":"ContainerDied","Data":"1126d3ebcef013ee6f9236b25810177dab2d52145cc78104f25351c76ce07c7c"} Mar 11 12:14:07 crc 
kubenswrapper[4816]: I0311 12:14:07.059350 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sn8n6" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.059373 4816 scope.go:117] "RemoveContainer" containerID="81cdb893a840fa2be39a4c7d019e4bfb168456690721e9768fa1711d4887d3df" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.078841 4816 scope.go:117] "RemoveContainer" containerID="60d6fa46e4d2deba8d1be2490841b0784c3be0ee06e9d85e187cb81fe38f5f2c" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.094680 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.098877 4816 scope.go:117] "RemoveContainer" containerID="39208e7933d3e745276296861bf7019ae82dfe513403abeb752230d2b7d5f71e" Mar 11 12:14:07 crc kubenswrapper[4816]: I0311 12:14:07.099612 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sn8n6"] Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.072605 4816 generic.go:334] "Generic (PLEG): container finished" podID="60a4785b-2b65-4a82-984d-611750e6e161" containerID="e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a" exitCode=0 Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.072686 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerDied","Data":"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a"} Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.143648 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7a354c-a3ec-44fe-8e27-028abd12d7d9" path="/var/lib/kubelet/pods/cf7a354c-a3ec-44fe-8e27-028abd12d7d9/volumes" Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.144932 4816 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" path="/var/lib/kubelet/pods/ff0a6696-3d0e-4173-9c1c-260bfdd757fb/volumes" Mar 11 12:14:08 crc kubenswrapper[4816]: I0311 12:14:08.769129 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:14:09 crc kubenswrapper[4816]: I0311 12:14:09.084671 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerStarted","Data":"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11"} Mar 11 12:14:09 crc kubenswrapper[4816]: I0311 12:14:09.105301 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hl4tp" podStartSLOduration=3.644991573 podStartE2EDuration="6.105279977s" podCreationTimestamp="2026-03-11 12:14:03 +0000 UTC" firstStartedPulling="2026-03-11 12:14:06.054015743 +0000 UTC m=+932.645279710" lastFinishedPulling="2026-03-11 12:14:08.514304147 +0000 UTC m=+935.105568114" observedRunningTime="2026-03-11 12:14:09.101919708 +0000 UTC m=+935.693183675" watchObservedRunningTime="2026-03-11 12:14:09.105279977 +0000 UTC m=+935.696543954" Mar 11 12:14:11 crc kubenswrapper[4816]: I0311 12:14:11.657880 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:14:11 crc kubenswrapper[4816]: I0311 12:14:11.659100 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z6224" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="registry-server" containerID="cri-o://d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" gracePeriod=2 Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.051018 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108672 4816 generic.go:334] "Generic (PLEG): container finished" podID="bb816e33-9a16-4326-8058-c328fabcab45" containerID="d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" exitCode=0 Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108716 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerDied","Data":"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf"} Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108744 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6224" event={"ID":"bb816e33-9a16-4326-8058-c328fabcab45","Type":"ContainerDied","Data":"7dbded14f0f4fb04bf34588fd808017260ed068974b08e77201c2fe4778827be"} Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108763 4816 scope.go:117] "RemoveContainer" containerID="d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.108875 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6224" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.129585 4816 scope.go:117] "RemoveContainer" containerID="50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.144460 4816 scope.go:117] "RemoveContainer" containerID="b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.145232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") pod \"bb816e33-9a16-4326-8058-c328fabcab45\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.145370 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") pod \"bb816e33-9a16-4326-8058-c328fabcab45\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.145396 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") pod \"bb816e33-9a16-4326-8058-c328fabcab45\" (UID: \"bb816e33-9a16-4326-8058-c328fabcab45\") " Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.146360 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities" (OuterVolumeSpecName: "utilities") pod "bb816e33-9a16-4326-8058-c328fabcab45" (UID: "bb816e33-9a16-4326-8058-c328fabcab45"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.152032 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4" (OuterVolumeSpecName: "kube-api-access-fwbh4") pod "bb816e33-9a16-4326-8058-c328fabcab45" (UID: "bb816e33-9a16-4326-8058-c328fabcab45"). InnerVolumeSpecName "kube-api-access-fwbh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.185453 4816 scope.go:117] "RemoveContainer" containerID="d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" Mar 11 12:14:12 crc kubenswrapper[4816]: E0311 12:14:12.185811 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf\": container with ID starting with d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf not found: ID does not exist" containerID="d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.185848 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf"} err="failed to get container status \"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf\": rpc error: code = NotFound desc = could not find container \"d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf\": container with ID starting with d48b543c419d15ade4d9d40ea1598ce35d14a31ad917766b4e5d4d50431d25bf not found: ID does not exist" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.185877 4816 scope.go:117] "RemoveContainer" containerID="50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89" Mar 11 12:14:12 crc kubenswrapper[4816]: E0311 12:14:12.186102 
4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89\": container with ID starting with 50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89 not found: ID does not exist" containerID="50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.186123 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89"} err="failed to get container status \"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89\": rpc error: code = NotFound desc = could not find container \"50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89\": container with ID starting with 50dd4694b1375b720ecd5bd8dd4bc0d5000424b020c48dc7352e366a56193c89 not found: ID does not exist" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.186138 4816 scope.go:117] "RemoveContainer" containerID="b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2" Mar 11 12:14:12 crc kubenswrapper[4816]: E0311 12:14:12.186372 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2\": container with ID starting with b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2 not found: ID does not exist" containerID="b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.186392 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2"} err="failed to get container status \"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2\": rpc error: code = 
NotFound desc = could not find container \"b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2\": container with ID starting with b22b4212fc4320b38df10fecd8bd4695175cbd5d32439913ffcd7f6485b6d8c2 not found: ID does not exist" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.204176 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb816e33-9a16-4326-8058-c328fabcab45" (UID: "bb816e33-9a16-4326-8058-c328fabcab45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.246669 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.246706 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwbh4\" (UniqueName: \"kubernetes.io/projected/bb816e33-9a16-4326-8058-c328fabcab45-kube-api-access-fwbh4\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.246730 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb816e33-9a16-4326-8058-c328fabcab45-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.439178 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:14:12 crc kubenswrapper[4816]: I0311 12:14:12.442947 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z6224"] Mar 11 12:14:14 crc kubenswrapper[4816]: I0311 12:14:14.138645 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bb816e33-9a16-4326-8058-c328fabcab45" path="/var/lib/kubelet/pods/bb816e33-9a16-4326-8058-c328fabcab45/volumes" Mar 11 12:14:14 crc kubenswrapper[4816]: I0311 12:14:14.203223 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:14 crc kubenswrapper[4816]: I0311 12:14:14.203322 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:14 crc kubenswrapper[4816]: I0311 12:14:14.244900 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:15 crc kubenswrapper[4816]: I0311 12:14:15.166582 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.472208 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zsrdm"] Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473301 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="extract-utilities" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473515 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="extract-utilities" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473613 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473635 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473718 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af8a107b-6295-42d4-b64b-7841171f67f3" containerName="oc" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473784 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8a107b-6295-42d4-b64b-7841171f67f3" containerName="oc" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473815 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="extract-utilities" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473835 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="extract-utilities" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.473912 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="extract-content" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.473982 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="extract-content" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.474021 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474083 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: E0311 12:14:16.474173 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="extract-content" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474201 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="extract-content" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474688 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="af8a107b-6295-42d4-b64b-7841171f67f3" containerName="oc" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474739 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0a6696-3d0e-4173-9c1c-260bfdd757fb" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.474772 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb816e33-9a16-4326-8058-c328fabcab45" containerName="registry-server" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.475768 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.478120 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.478244 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.478490 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9g5kf" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.482854 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zsrdm"] Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.605532 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzx44\" (UniqueName: \"kubernetes.io/projected/4ed28d20-6f1f-4bb8-853d-284003a6b922-kube-api-access-gzx44\") pod \"openstack-operator-index-zsrdm\" (UID: \"4ed28d20-6f1f-4bb8-853d-284003a6b922\") " pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.707066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzx44\" 
(UniqueName: \"kubernetes.io/projected/4ed28d20-6f1f-4bb8-853d-284003a6b922-kube-api-access-gzx44\") pod \"openstack-operator-index-zsrdm\" (UID: \"4ed28d20-6f1f-4bb8-853d-284003a6b922\") " pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.735629 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzx44\" (UniqueName: \"kubernetes.io/projected/4ed28d20-6f1f-4bb8-853d-284003a6b922-kube-api-access-gzx44\") pod \"openstack-operator-index-zsrdm\" (UID: \"4ed28d20-6f1f-4bb8-853d-284003a6b922\") " pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:16 crc kubenswrapper[4816]: I0311 12:14:16.804625 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:17 crc kubenswrapper[4816]: I0311 12:14:17.193767 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zsrdm"] Mar 11 12:14:18 crc kubenswrapper[4816]: I0311 12:14:18.147096 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zsrdm" event={"ID":"4ed28d20-6f1f-4bb8-853d-284003a6b922","Type":"ContainerStarted","Data":"bc47e16ddb2554c3a948cfe55fffa01c25b39ecc19919abe73f2cd6c26a10c85"} Mar 11 12:14:20 crc kubenswrapper[4816]: I0311 12:14:20.164931 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zsrdm" event={"ID":"4ed28d20-6f1f-4bb8-853d-284003a6b922","Type":"ContainerStarted","Data":"06547a216a99fb53c5fd48aab18860c55dca1b06c356f47e1deb51a7bcdcc0d7"} Mar 11 12:14:20 crc kubenswrapper[4816]: I0311 12:14:20.181094 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zsrdm" podStartSLOduration=2.340480781 podStartE2EDuration="4.18107326s" podCreationTimestamp="2026-03-11 12:14:16 +0000 UTC" 
firstStartedPulling="2026-03-11 12:14:17.202768582 +0000 UTC m=+943.794032549" lastFinishedPulling="2026-03-11 12:14:19.043361061 +0000 UTC m=+945.634625028" observedRunningTime="2026-03-11 12:14:20.177050421 +0000 UTC m=+946.768314408" watchObservedRunningTime="2026-03-11 12:14:20.18107326 +0000 UTC m=+946.772337237" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.461879 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.462607 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hl4tp" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="registry-server" containerID="cri-o://8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" gracePeriod=2 Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.835618 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.883371 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") pod \"60a4785b-2b65-4a82-984d-611750e6e161\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.883756 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") pod \"60a4785b-2b65-4a82-984d-611750e6e161\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.883875 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") pod \"60a4785b-2b65-4a82-984d-611750e6e161\" (UID: \"60a4785b-2b65-4a82-984d-611750e6e161\") " Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.884869 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities" (OuterVolumeSpecName: "utilities") pod "60a4785b-2b65-4a82-984d-611750e6e161" (UID: "60a4785b-2b65-4a82-984d-611750e6e161"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.885583 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.892624 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l" (OuterVolumeSpecName: "kube-api-access-2dg7l") pod "60a4785b-2b65-4a82-984d-611750e6e161" (UID: "60a4785b-2b65-4a82-984d-611750e6e161"). InnerVolumeSpecName "kube-api-access-2dg7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.909835 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60a4785b-2b65-4a82-984d-611750e6e161" (UID: "60a4785b-2b65-4a82-984d-611750e6e161"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.986624 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60a4785b-2b65-4a82-984d-611750e6e161-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:21 crc kubenswrapper[4816]: I0311 12:14:21.986660 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dg7l\" (UniqueName: \"kubernetes.io/projected/60a4785b-2b65-4a82-984d-611750e6e161-kube-api-access-2dg7l\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180582 4816 generic.go:334] "Generic (PLEG): container finished" podID="60a4785b-2b65-4a82-984d-611750e6e161" containerID="8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" exitCode=0 Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180630 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerDied","Data":"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11"} Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180663 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl4tp" event={"ID":"60a4785b-2b65-4a82-984d-611750e6e161","Type":"ContainerDied","Data":"f45dd7f636b52fe2c1b2fb545c6a5ddc989520a45cd358b29620164b0db18c6d"} Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180684 4816 scope.go:117] "RemoveContainer" containerID="8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.180816 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl4tp" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.207618 4816 scope.go:117] "RemoveContainer" containerID="e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.215198 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.220316 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl4tp"] Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.645137 4816 scope.go:117] "RemoveContainer" containerID="39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.678151 4816 scope.go:117] "RemoveContainer" containerID="8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" Mar 11 12:14:22 crc kubenswrapper[4816]: E0311 12:14:22.679241 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11\": container with ID starting with 8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11 not found: ID does not exist" containerID="8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.679307 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11"} err="failed to get container status \"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11\": rpc error: code = NotFound desc = could not find container \"8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11\": container with ID starting with 8ff09a236d35220a7c1625025b22e14e4738aad432e4cd1765a46150c683df11 not found: 
ID does not exist" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.679339 4816 scope.go:117] "RemoveContainer" containerID="e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a" Mar 11 12:14:22 crc kubenswrapper[4816]: E0311 12:14:22.679689 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a\": container with ID starting with e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a not found: ID does not exist" containerID="e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.679731 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a"} err="failed to get container status \"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a\": rpc error: code = NotFound desc = could not find container \"e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a\": container with ID starting with e61cfd9a2b01983446a43651931726fee3673190ae6f2dbf6a8cbb0dc1f07d0a not found: ID does not exist" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.679761 4816 scope.go:117] "RemoveContainer" containerID="39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3" Mar 11 12:14:22 crc kubenswrapper[4816]: E0311 12:14:22.680170 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3\": container with ID starting with 39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3 not found: ID does not exist" containerID="39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3" Mar 11 12:14:22 crc kubenswrapper[4816]: I0311 12:14:22.680210 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3"} err="failed to get container status \"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3\": rpc error: code = NotFound desc = could not find container \"39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3\": container with ID starting with 39b076f96813251238b95cb5e0b4a9dd99b2aa58def497eb7a54a009755a64b3 not found: ID does not exist" Mar 11 12:14:24 crc kubenswrapper[4816]: I0311 12:14:24.142883 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a4785b-2b65-4a82-984d-611750e6e161" path="/var/lib/kubelet/pods/60a4785b-2b65-4a82-984d-611750e6e161/volumes" Mar 11 12:14:26 crc kubenswrapper[4816]: I0311 12:14:26.805036 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:26 crc kubenswrapper[4816]: I0311 12:14:26.805308 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:26 crc kubenswrapper[4816]: I0311 12:14:26.856692 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:27 crc kubenswrapper[4816]: I0311 12:14:27.263544 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zsrdm" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.115562 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp"] Mar 11 12:14:30 crc kubenswrapper[4816]: E0311 12:14:30.116183 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="extract-content" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.116196 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="extract-content" Mar 11 12:14:30 crc kubenswrapper[4816]: E0311 12:14:30.116218 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="registry-server" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.116224 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="registry-server" Mar 11 12:14:30 crc kubenswrapper[4816]: E0311 12:14:30.116240 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="extract-utilities" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.116280 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="extract-utilities" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.116408 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a4785b-2b65-4a82-984d-611750e6e161" containerName="registry-server" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.117236 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.120959 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-szh9h" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.141628 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp"] Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.199562 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.199620 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.199668 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 
12:14:30.301236 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.301350 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.301462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.301727 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.302051 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.321734 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.436922 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:30 crc kubenswrapper[4816]: I0311 12:14:30.762686 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp"] Mar 11 12:14:31 crc kubenswrapper[4816]: I0311 12:14:31.256617 4816 generic.go:334] "Generic (PLEG): container finished" podID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerID="7181d533daaec28ea763a1ea9b634ea77a11c32e37954305eff24aceadd1e6f0" exitCode=0 Mar 11 12:14:31 crc kubenswrapper[4816]: I0311 12:14:31.256674 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerDied","Data":"7181d533daaec28ea763a1ea9b634ea77a11c32e37954305eff24aceadd1e6f0"} Mar 11 12:14:31 crc kubenswrapper[4816]: I0311 12:14:31.256743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerStarted","Data":"0f7ae4f3a64f0f17be7cd7cbfd6e70571aba9c54e9c7a8d2350c0a75f577adf3"} Mar 11 12:14:33 crc kubenswrapper[4816]: I0311 12:14:33.278215 4816 generic.go:334] "Generic (PLEG): container finished" podID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerID="0317c320a989f9d7e0d8233532a8d978d4493f5715d4fa8c266bef1645d03952" exitCode=0 Mar 11 12:14:33 crc kubenswrapper[4816]: I0311 12:14:33.278343 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerDied","Data":"0317c320a989f9d7e0d8233532a8d978d4493f5715d4fa8c266bef1645d03952"} Mar 11 12:14:34 crc kubenswrapper[4816]: I0311 12:14:34.288669 4816 generic.go:334] "Generic (PLEG): container finished" podID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerID="3829554e987ca45394aac167f62c69b8a8354a92f096cc5380d4c5403050c8ed" exitCode=0 Mar 11 12:14:34 crc kubenswrapper[4816]: I0311 12:14:34.288718 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerDied","Data":"3829554e987ca45394aac167f62c69b8a8354a92f096cc5380d4c5403050c8ed"} Mar 11 12:14:34 crc kubenswrapper[4816]: I0311 12:14:34.708411 4816 scope.go:117] "RemoveContainer" containerID="ce753e08f0c29759ce4abeca1c2ba4ffc8217be9eee018a375b073d4682d5231" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.590362 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696016 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") pod \"b89bbc79-4a51-434d-916c-bf02869be9cb\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696169 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") pod \"b89bbc79-4a51-434d-916c-bf02869be9cb\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696319 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") pod \"b89bbc79-4a51-434d-916c-bf02869be9cb\" (UID: \"b89bbc79-4a51-434d-916c-bf02869be9cb\") " Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696762 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle" (OuterVolumeSpecName: "bundle") pod "b89bbc79-4a51-434d-916c-bf02869be9cb" (UID: "b89bbc79-4a51-434d-916c-bf02869be9cb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.696915 4816 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.701893 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647" (OuterVolumeSpecName: "kube-api-access-zc647") pod "b89bbc79-4a51-434d-916c-bf02869be9cb" (UID: "b89bbc79-4a51-434d-916c-bf02869be9cb"). InnerVolumeSpecName "kube-api-access-zc647". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.711610 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util" (OuterVolumeSpecName: "util") pod "b89bbc79-4a51-434d-916c-bf02869be9cb" (UID: "b89bbc79-4a51-434d-916c-bf02869be9cb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.797901 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc647\" (UniqueName: \"kubernetes.io/projected/b89bbc79-4a51-434d-916c-bf02869be9cb-kube-api-access-zc647\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:35 crc kubenswrapper[4816]: I0311 12:14:35.797951 4816 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b89bbc79-4a51-434d-916c-bf02869be9cb-util\") on node \"crc\" DevicePath \"\"" Mar 11 12:14:36 crc kubenswrapper[4816]: E0311 12:14:36.226282 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89bbc79_4a51_434d_916c_bf02869be9cb.slice\": RecentStats: unable to find data in memory cache]" Mar 11 12:14:36 crc kubenswrapper[4816]: I0311 12:14:36.309486 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" event={"ID":"b89bbc79-4a51-434d-916c-bf02869be9cb","Type":"ContainerDied","Data":"0f7ae4f3a64f0f17be7cd7cbfd6e70571aba9c54e9c7a8d2350c0a75f577adf3"} Mar 11 12:14:36 crc kubenswrapper[4816]: I0311 12:14:36.309526 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f7ae4f3a64f0f17be7cd7cbfd6e70571aba9c54e9c7a8d2350c0a75f577adf3" Mar 11 12:14:36 crc kubenswrapper[4816]: I0311 12:14:36.310179 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp" Mar 11 12:14:39 crc kubenswrapper[4816]: I0311 12:14:39.515100 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:14:39 crc kubenswrapper[4816]: I0311 12:14:39.515508 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.271301 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl"] Mar 11 12:14:42 crc kubenswrapper[4816]: E0311 12:14:42.272353 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="util" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.272455 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="util" Mar 11 12:14:42 crc kubenswrapper[4816]: E0311 12:14:42.272536 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="pull" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.272618 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="pull" Mar 11 12:14:42 crc kubenswrapper[4816]: E0311 12:14:42.272738 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" 
containerName="extract" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.272823 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="extract" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.273039 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89bbc79-4a51-434d-916c-bf02869be9cb" containerName="extract" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.273617 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.276810 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-hv4gd" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.353498 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl"] Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.398453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b7lj\" (UniqueName: \"kubernetes.io/projected/0347df32-1ff0-463e-b073-077df8f41595-kube-api-access-9b7lj\") pod \"openstack-operator-controller-init-65b9994cf8-zz7rl\" (UID: \"0347df32-1ff0-463e-b073-077df8f41595\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.500116 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b7lj\" (UniqueName: \"kubernetes.io/projected/0347df32-1ff0-463e-b073-077df8f41595-kube-api-access-9b7lj\") pod \"openstack-operator-controller-init-65b9994cf8-zz7rl\" (UID: \"0347df32-1ff0-463e-b073-077df8f41595\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:42 crc kubenswrapper[4816]: 
I0311 12:14:42.520341 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b7lj\" (UniqueName: \"kubernetes.io/projected/0347df32-1ff0-463e-b073-077df8f41595-kube-api-access-9b7lj\") pod \"openstack-operator-controller-init-65b9994cf8-zz7rl\" (UID: \"0347df32-1ff0-463e-b073-077df8f41595\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:42 crc kubenswrapper[4816]: I0311 12:14:42.590695 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:43 crc kubenswrapper[4816]: I0311 12:14:43.045661 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl"] Mar 11 12:14:43 crc kubenswrapper[4816]: W0311 12:14:43.046776 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0347df32_1ff0_463e_b073_077df8f41595.slice/crio-2ca89b047ad118254203b9f9d0c2e99b5ef8214f1a22eab79f3ae66bebb3d17f WatchSource:0}: Error finding container 2ca89b047ad118254203b9f9d0c2e99b5ef8214f1a22eab79f3ae66bebb3d17f: Status 404 returned error can't find the container with id 2ca89b047ad118254203b9f9d0c2e99b5ef8214f1a22eab79f3ae66bebb3d17f Mar 11 12:14:43 crc kubenswrapper[4816]: I0311 12:14:43.354127 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" event={"ID":"0347df32-1ff0-463e-b073-077df8f41595","Type":"ContainerStarted","Data":"2ca89b047ad118254203b9f9d0c2e99b5ef8214f1a22eab79f3ae66bebb3d17f"} Mar 11 12:14:54 crc kubenswrapper[4816]: I0311 12:14:54.445368 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" 
event={"ID":"0347df32-1ff0-463e-b073-077df8f41595","Type":"ContainerStarted","Data":"1a2efe43b36e5cbc49391b95d744323354a32fe47b14aebca8806b630cc9062c"} Mar 11 12:14:54 crc kubenswrapper[4816]: I0311 12:14:54.446021 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:14:54 crc kubenswrapper[4816]: I0311 12:14:54.487949 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" podStartSLOduration=1.909275928 podStartE2EDuration="12.487931938s" podCreationTimestamp="2026-03-11 12:14:42 +0000 UTC" firstStartedPulling="2026-03-11 12:14:43.049315867 +0000 UTC m=+969.640579834" lastFinishedPulling="2026-03-11 12:14:53.627971827 +0000 UTC m=+980.219235844" observedRunningTime="2026-03-11 12:14:54.48558317 +0000 UTC m=+981.076847137" watchObservedRunningTime="2026-03-11 12:14:54.487931938 +0000 UTC m=+981.079195905" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.152537 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.154711 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.156964 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.157685 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.165050 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.297642 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.297715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.298258 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.401217 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.401382 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.401461 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.402892 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.416272 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.428493 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") pod \"collect-profiles-29553855-l4sqr\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.485745 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:00 crc kubenswrapper[4816]: I0311 12:15:00.712978 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 12:15:01 crc kubenswrapper[4816]: I0311 12:15:01.495853 4816 generic.go:334] "Generic (PLEG): container finished" podID="a876e965-7c6d-4773-9c9b-f445411c559b" containerID="f55a9848386a64adca827b95cdc172bd623f9f4d2757b50c73cba6bd74ab25e2" exitCode=0 Mar 11 12:15:01 crc kubenswrapper[4816]: I0311 12:15:01.495897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" event={"ID":"a876e965-7c6d-4773-9c9b-f445411c559b","Type":"ContainerDied","Data":"f55a9848386a64adca827b95cdc172bd623f9f4d2757b50c73cba6bd74ab25e2"} Mar 11 12:15:01 crc kubenswrapper[4816]: I0311 12:15:01.495925 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" 
event={"ID":"a876e965-7c6d-4773-9c9b-f445411c559b","Type":"ContainerStarted","Data":"b0d101521d949bdb643e3f587d74f6788adbc883ab03a4afcd19669250b4364c"} Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.595324 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-zz7rl" Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.793020 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.942546 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") pod \"a876e965-7c6d-4773-9c9b-f445411c559b\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.943396 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") pod \"a876e965-7c6d-4773-9c9b-f445411c559b\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.943556 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") pod \"a876e965-7c6d-4773-9c9b-f445411c559b\" (UID: \"a876e965-7c6d-4773-9c9b-f445411c559b\") " Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.944235 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume" (OuterVolumeSpecName: "config-volume") pod "a876e965-7c6d-4773-9c9b-f445411c559b" (UID: 
"a876e965-7c6d-4773-9c9b-f445411c559b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.950437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm" (OuterVolumeSpecName: "kube-api-access-lkstm") pod "a876e965-7c6d-4773-9c9b-f445411c559b" (UID: "a876e965-7c6d-4773-9c9b-f445411c559b"). InnerVolumeSpecName "kube-api-access-lkstm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:15:02 crc kubenswrapper[4816]: I0311 12:15:02.951899 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a876e965-7c6d-4773-9c9b-f445411c559b" (UID: "a876e965-7c6d-4773-9c9b-f445411c559b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.045819 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkstm\" (UniqueName: \"kubernetes.io/projected/a876e965-7c6d-4773-9c9b-f445411c559b-kube-api-access-lkstm\") on node \"crc\" DevicePath \"\"" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.045892 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a876e965-7c6d-4773-9c9b-f445411c559b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.045919 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a876e965-7c6d-4773-9c9b-f445411c559b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.512161 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" event={"ID":"a876e965-7c6d-4773-9c9b-f445411c559b","Type":"ContainerDied","Data":"b0d101521d949bdb643e3f587d74f6788adbc883ab03a4afcd19669250b4364c"} Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.512234 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0d101521d949bdb643e3f587d74f6788adbc883ab03a4afcd19669250b4364c" Mar 11 12:15:03 crc kubenswrapper[4816]: I0311 12:15:03.512308 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr" Mar 11 12:15:09 crc kubenswrapper[4816]: I0311 12:15:09.515285 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:15:09 crc kubenswrapper[4816]: I0311 12:15:09.515814 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.909489 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"] Mar 11 12:15:21 crc kubenswrapper[4816]: E0311 12:15:21.910333 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a876e965-7c6d-4773-9c9b-f445411c559b" containerName="collect-profiles" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.910347 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a876e965-7c6d-4773-9c9b-f445411c559b" 
containerName="collect-profiles" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.910449 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a876e965-7c6d-4773-9c9b-f445411c559b" containerName="collect-profiles" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.910841 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.913897 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4cxnm" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.918281 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.919061 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.921465 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-sj8rq" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.923038 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.934749 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.937691 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.942923 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gg2r5" Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.966538 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.971920 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.983706 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"] Mar 11 12:15:21 crc kubenswrapper[4816]: I0311 12:15:21.989373 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.009704 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4k7cn" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.020675 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.031059 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhngz\" (UniqueName: \"kubernetes.io/projected/a8133b64-eb11-43ad-bf6e-a278af0ff466-kube-api-access-xhngz\") pod \"barbican-operator-controller-manager-677bd678f7-rb228\" (UID: \"a8133b64-eb11-43ad-bf6e-a278af0ff466\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.031353 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv98c\" (UniqueName: \"kubernetes.io/projected/6311ca5f-6f4c-4768-ae5e-75128be7f589-kube-api-access-xv98c\") pod \"cinder-operator-controller-manager-984cd4dcf-g8cg2\" (UID: \"6311ca5f-6f4c-4768-ae5e-75128be7f589\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.044325 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.045359 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.048772 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xzvbx" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.052449 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.053369 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.054997 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rh87f" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.057389 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.067609 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.082864 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.083639 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.088040 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.089231 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vxmbd" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.090962 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.113113 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.114043 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.118092 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7q6t8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.119105 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.120012 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.123674 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2vcjw" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.123805 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.132819 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmxqs\" (UniqueName: \"kubernetes.io/projected/c28c6622-633e-4e76-9c9a-eb732531fa1a-kube-api-access-jmxqs\") pod \"glance-operator-controller-manager-5964f64c48-px2wm\" (UID: \"c28c6622-633e-4e76-9c9a-eb732531fa1a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.132864 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv98c\" (UniqueName: \"kubernetes.io/projected/6311ca5f-6f4c-4768-ae5e-75128be7f589-kube-api-access-xv98c\") pod \"cinder-operator-controller-manager-984cd4dcf-g8cg2\" (UID: \"6311ca5f-6f4c-4768-ae5e-75128be7f589\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.132913 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhngz\" (UniqueName: \"kubernetes.io/projected/a8133b64-eb11-43ad-bf6e-a278af0ff466-kube-api-access-xhngz\") pod \"barbican-operator-controller-manager-677bd678f7-rb228\" (UID: \"a8133b64-eb11-43ad-bf6e-a278af0ff466\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.132976 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szv9g\" (UniqueName: \"kubernetes.io/projected/72237264-5d09-40bd-ba83-f30b76790cb6-kube-api-access-szv9g\") pod \"designate-operator-controller-manager-66d56f6ff4-fjkn4\" (UID: \"72237264-5d09-40bd-ba83-f30b76790cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.161801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhngz\" (UniqueName: \"kubernetes.io/projected/a8133b64-eb11-43ad-bf6e-a278af0ff466-kube-api-access-xhngz\") pod \"barbican-operator-controller-manager-677bd678f7-rb228\" (UID: \"a8133b64-eb11-43ad-bf6e-a278af0ff466\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.165923 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.166792 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.166823 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.167469 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.168068 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.168175 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.168740 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.169072 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.171658 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-lng6s" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.171871 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-b4h4x" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.175331 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.181459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv98c\" (UniqueName: \"kubernetes.io/projected/6311ca5f-6f4c-4768-ae5e-75128be7f589-kube-api-access-xv98c\") pod \"cinder-operator-controller-manager-984cd4dcf-g8cg2\" (UID: \"6311ca5f-6f4c-4768-ae5e-75128be7f589\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.181947 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vlwnm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.217081 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"] 
Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234124 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mskc2\" (UniqueName: \"kubernetes.io/projected/f37fb9b3-7b07-4188-b9ea-facfa5e945f0-kube-api-access-mskc2\") pod \"ironic-operator-controller-manager-6bbb499bbc-874hd\" (UID: \"f37fb9b3-7b07-4188-b9ea-facfa5e945f0\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234206 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ws7\" (UniqueName: \"kubernetes.io/projected/b941b0f1-4a8f-4517-af46-cc77892fe3d9-kube-api-access-v8ws7\") pod \"heat-operator-controller-manager-77b6666d85-66ctj\" (UID: \"b941b0f1-4a8f-4517-af46-cc77892fe3d9\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234276 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szv9g\" (UniqueName: \"kubernetes.io/projected/72237264-5d09-40bd-ba83-f30b76790cb6-kube-api-access-szv9g\") pod \"designate-operator-controller-manager-66d56f6ff4-fjkn4\" (UID: \"72237264-5d09-40bd-ba83-f30b76790cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234317 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmxqs\" (UniqueName: \"kubernetes.io/projected/c28c6622-633e-4e76-9c9a-eb732531fa1a-kube-api-access-jmxqs\") pod \"glance-operator-controller-manager-5964f64c48-px2wm\" (UID: \"c28c6622-633e-4e76-9c9a-eb732531fa1a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234352 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkqph\" (UniqueName: \"kubernetes.io/projected/9e0c8832-9c20-44a9-933c-4a7fff032367-kube-api-access-vkqph\") pod \"horizon-operator-controller-manager-6d9d6b584d-8v46x\" (UID: \"9e0c8832-9c20-44a9-933c-4a7fff032367\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234387 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhgkz\" (UniqueName: \"kubernetes.io/projected/73e00d02-6599-4cab-a32b-8fe96b82951a-kube-api-access-nhgkz\") pod \"keystone-operator-controller-manager-684f77d66d-zczdq\" (UID: \"73e00d02-6599-4cab-a32b-8fe96b82951a\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234408 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khvb9\" (UniqueName: \"kubernetes.io/projected/a605e964-6e3c-4639-95d5-908f5d0ab7ef-kube-api-access-khvb9\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.234433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.241807 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"] Mar 11 12:15:22 crc 
kubenswrapper[4816]: I0311 12:15:22.242638 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.243603 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.248334 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4bdp8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.249095 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.257498 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.258431 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.260713 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-zwd24" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.277802 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szv9g\" (UniqueName: \"kubernetes.io/projected/72237264-5d09-40bd-ba83-f30b76790cb6-kube-api-access-szv9g\") pod \"designate-operator-controller-manager-66d56f6ff4-fjkn4\" (UID: \"72237264-5d09-40bd-ba83-f30b76790cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.279337 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.284140 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmxqs\" (UniqueName: \"kubernetes.io/projected/c28c6622-633e-4e76-9c9a-eb732531fa1a-kube-api-access-jmxqs\") pod \"glance-operator-controller-manager-5964f64c48-px2wm\" (UID: \"c28c6622-633e-4e76-9c9a-eb732531fa1a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.284493 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.286964 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.297182 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.298104 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.301445 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-mnct8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.333622 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.334671 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.334822 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.336483 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338042 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhjl\" (UniqueName: \"kubernetes.io/projected/5d318732-8194-49eb-a2a3-c5b13ce843a7-kube-api-access-5hhjl\") pod \"mariadb-operator-controller-manager-658d4cdd5-wnsst\" (UID: \"5d318732-8194-49eb-a2a3-c5b13ce843a7\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338068 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczr6\" (UniqueName: \"kubernetes.io/projected/bcfe1f90-2b5f-43b7-b798-0bad62ec53b2-kube-api-access-hczr6\") pod \"manila-operator-controller-manager-68f45f9d9f-bl9hm\" (UID: \"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338098 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ws7\" (UniqueName: \"kubernetes.io/projected/b941b0f1-4a8f-4517-af46-cc77892fe3d9-kube-api-access-v8ws7\") pod \"heat-operator-controller-manager-77b6666d85-66ctj\" (UID: \"b941b0f1-4a8f-4517-af46-cc77892fe3d9\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338131 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9sn\" (UniqueName: \"kubernetes.io/projected/b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5-kube-api-access-9b9sn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-56fsw\" (UID: \"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5\") " 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkqph\" (UniqueName: \"kubernetes.io/projected/9e0c8832-9c20-44a9-933c-4a7fff032367-kube-api-access-vkqph\") pod \"horizon-operator-controller-manager-6d9d6b584d-8v46x\" (UID: \"9e0c8832-9c20-44a9-933c-4a7fff032367\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338201 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhgkz\" (UniqueName: \"kubernetes.io/projected/73e00d02-6599-4cab-a32b-8fe96b82951a-kube-api-access-nhgkz\") pod \"keystone-operator-controller-manager-684f77d66d-zczdq\" (UID: \"73e00d02-6599-4cab-a32b-8fe96b82951a\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338221 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khvb9\" (UniqueName: \"kubernetes.io/projected/a605e964-6e3c-4639-95d5-908f5d0ab7ef-kube-api-access-khvb9\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338792 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dj6\" (UniqueName: \"kubernetes.io/projected/4d4c74ff-52a2-4426-bd06-daa6e9b1a832-kube-api-access-58dj6\") pod \"neutron-operator-controller-manager-776c5696bf-h2vmc\" (UID: \"4d4c74ff-52a2-4426-bd06-daa6e9b1a832\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.338860 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mskc2\" (UniqueName: \"kubernetes.io/projected/f37fb9b3-7b07-4188-b9ea-facfa5e945f0-kube-api-access-mskc2\") pod \"ironic-operator-controller-manager-6bbb499bbc-874hd\" (UID: \"f37fb9b3-7b07-4188-b9ea-facfa5e945f0\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.338935 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.339017 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:22.838994391 +0000 UTC m=+1009.430258358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.343760 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.344019 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fps72" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.344325 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8r4xj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.353837 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.360085 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mskc2\" (UniqueName: \"kubernetes.io/projected/f37fb9b3-7b07-4188-b9ea-facfa5e945f0-kube-api-access-mskc2\") pod \"ironic-operator-controller-manager-6bbb499bbc-874hd\" (UID: \"f37fb9b3-7b07-4188-b9ea-facfa5e945f0\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.362695 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkqph\" (UniqueName: \"kubernetes.io/projected/9e0c8832-9c20-44a9-933c-4a7fff032367-kube-api-access-vkqph\") pod \"horizon-operator-controller-manager-6d9d6b584d-8v46x\" (UID: \"9e0c8832-9c20-44a9-933c-4a7fff032367\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.365077 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ws7\" (UniqueName: \"kubernetes.io/projected/b941b0f1-4a8f-4517-af46-cc77892fe3d9-kube-api-access-v8ws7\") pod \"heat-operator-controller-manager-77b6666d85-66ctj\" (UID: \"b941b0f1-4a8f-4517-af46-cc77892fe3d9\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.365179 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.368663 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhgkz\" (UniqueName: \"kubernetes.io/projected/73e00d02-6599-4cab-a32b-8fe96b82951a-kube-api-access-nhgkz\") pod \"keystone-operator-controller-manager-684f77d66d-zczdq\" (UID: 
\"73e00d02-6599-4cab-a32b-8fe96b82951a\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.368723 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.375848 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khvb9\" (UniqueName: \"kubernetes.io/projected/a605e964-6e3c-4639-95d5-908f5d0ab7ef-kube-api-access-khvb9\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.401316 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.406985 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.432664 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.450744 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.453991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gztj\" (UniqueName: \"kubernetes.io/projected/d1702062-37ba-43c0-becb-005e11f457a0-kube-api-access-4gztj\") pod \"nova-operator-controller-manager-569cc54c5-rxhkb\" (UID: \"d1702062-37ba-43c0-becb-005e11f457a0\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454027 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dj6\" (UniqueName: \"kubernetes.io/projected/4d4c74ff-52a2-4426-bd06-daa6e9b1a832-kube-api-access-58dj6\") pod \"neutron-operator-controller-manager-776c5696bf-h2vmc\" (UID: \"4d4c74ff-52a2-4426-bd06-daa6e9b1a832\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454061 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhjl\" (UniqueName: \"kubernetes.io/projected/5d318732-8194-49eb-a2a3-c5b13ce843a7-kube-api-access-5hhjl\") pod \"mariadb-operator-controller-manager-658d4cdd5-wnsst\" (UID: \"5d318732-8194-49eb-a2a3-c5b13ce843a7\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454078 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczr6\" (UniqueName: \"kubernetes.io/projected/bcfe1f90-2b5f-43b7-b798-0bad62ec53b2-kube-api-access-hczr6\") pod \"manila-operator-controller-manager-68f45f9d9f-bl9hm\" (UID: \"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:22 crc kubenswrapper[4816]: 
I0311 12:15:22.454111 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b9sn\" (UniqueName: \"kubernetes.io/projected/b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5-kube-api-access-9b9sn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-56fsw\" (UID: \"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454141 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9rq\" (UniqueName: \"kubernetes.io/projected/e04ad395-8120-4c57-8575-611fa438e8fb-kube-api-access-9h9rq\") pod \"placement-operator-controller-manager-574d45c66c-h7kgb\" (UID: \"e04ad395-8120-4c57-8575-611fa438e8fb\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454169 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnvxf\" (UniqueName: \"kubernetes.io/projected/78a7aebd-70a2-4608-a669-aea496cb6186-kube-api-access-pnvxf\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454189 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.454219 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s9x8\" (UniqueName: \"kubernetes.io/projected/6bbceab2-fe2b-4693-867d-aa2a51261611-kube-api-access-2s9x8\") pod \"ovn-operator-controller-manager-bbc5b68f9-rr62t\" (UID: \"6bbceab2-fe2b-4693-867d-aa2a51261611\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.458439 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-426qz"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.459323 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.461636 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.474766 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-cm6q7" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.489017 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dj6\" (UniqueName: \"kubernetes.io/projected/4d4c74ff-52a2-4426-bd06-daa6e9b1a832-kube-api-access-58dj6\") pod \"neutron-operator-controller-manager-776c5696bf-h2vmc\" (UID: \"4d4c74ff-52a2-4426-bd06-daa6e9b1a832\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.514745 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b9sn\" (UniqueName: \"kubernetes.io/projected/b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5-kube-api-access-9b9sn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-56fsw\" (UID: 
\"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.515020 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hhjl\" (UniqueName: \"kubernetes.io/projected/5d318732-8194-49eb-a2a3-c5b13ce843a7-kube-api-access-5hhjl\") pod \"mariadb-operator-controller-manager-658d4cdd5-wnsst\" (UID: \"5d318732-8194-49eb-a2a3-c5b13ce843a7\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.516511 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-426qz"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.519046 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczr6\" (UniqueName: \"kubernetes.io/projected/bcfe1f90-2b5f-43b7-b798-0bad62ec53b2-kube-api-access-hczr6\") pod \"manila-operator-controller-manager-68f45f9d9f-bl9hm\" (UID: \"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.533678 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.552632 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.553857 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555277 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9rq\" (UniqueName: \"kubernetes.io/projected/e04ad395-8120-4c57-8575-611fa438e8fb-kube-api-access-9h9rq\") pod \"placement-operator-controller-manager-574d45c66c-h7kgb\" (UID: \"e04ad395-8120-4c57-8575-611fa438e8fb\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555335 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnvxf\" (UniqueName: \"kubernetes.io/projected/78a7aebd-70a2-4608-a669-aea496cb6186-kube-api-access-pnvxf\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555364 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.555401 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s9x8\" (UniqueName: \"kubernetes.io/projected/6bbceab2-fe2b-4693-867d-aa2a51261611-kube-api-access-2s9x8\") pod \"ovn-operator-controller-manager-bbc5b68f9-rr62t\" (UID: \"6bbceab2-fe2b-4693-867d-aa2a51261611\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 
12:15:22.555446 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gztj\" (UniqueName: \"kubernetes.io/projected/d1702062-37ba-43c0-becb-005e11f457a0-kube-api-access-4gztj\") pod \"nova-operator-controller-manager-569cc54c5-rxhkb\" (UID: \"d1702062-37ba-43c0-becb-005e11f457a0\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.556215 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.556278 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:23.056260493 +0000 UTC m=+1009.647524460 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.559472 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7f9j2" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.584544 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9rq\" (UniqueName: \"kubernetes.io/projected/e04ad395-8120-4c57-8575-611fa438e8fb-kube-api-access-9h9rq\") pod \"placement-operator-controller-manager-574d45c66c-h7kgb\" (UID: \"e04ad395-8120-4c57-8575-611fa438e8fb\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.590702 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnvxf\" (UniqueName: \"kubernetes.io/projected/78a7aebd-70a2-4608-a669-aea496cb6186-kube-api-access-pnvxf\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.592848 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gztj\" (UniqueName: \"kubernetes.io/projected/d1702062-37ba-43c0-becb-005e11f457a0-kube-api-access-4gztj\") pod \"nova-operator-controller-manager-569cc54c5-rxhkb\" (UID: \"d1702062-37ba-43c0-becb-005e11f457a0\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.593574 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2s9x8\" (UniqueName: \"kubernetes.io/projected/6bbceab2-fe2b-4693-867d-aa2a51261611-kube-api-access-2s9x8\") pod \"ovn-operator-controller-manager-bbc5b68f9-rr62t\" (UID: \"6bbceab2-fe2b-4693-867d-aa2a51261611\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.599118 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.654055 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.664494 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.667425 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.663023 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9n7\" (UniqueName: \"kubernetes.io/projected/d7932403-615f-44e4-b195-4a83c19787ba-kube-api-access-4m9n7\") pod \"swift-operator-controller-manager-677c674df7-426qz\" (UID: \"d7932403-615f-44e4-b195-4a83c19787ba\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.667934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxcrf\" (UniqueName: \"kubernetes.io/projected/0ddf91ff-6d91-4213-8032-05f80408063d-kube-api-access-mxcrf\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7ldx8\" (UID: \"0ddf91ff-6d91-4213-8032-05f80408063d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.668556 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.669734 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zmq6m" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.675601 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.691155 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.710327 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.718568 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.719735 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.722953 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vx4jk" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.735038 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.739543 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.761578 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.769074 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2b8h\" (UniqueName: \"kubernetes.io/projected/4126be7d-7ca8-4e68-94d4-ea21644fbd85-kube-api-access-q2b8h\") pod \"watcher-operator-controller-manager-6dd88c6f67-kx9nz\" (UID: \"4126be7d-7ca8-4e68-94d4-ea21644fbd85\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.769183 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9n7\" (UniqueName: \"kubernetes.io/projected/d7932403-615f-44e4-b195-4a83c19787ba-kube-api-access-4m9n7\") pod \"swift-operator-controller-manager-677c674df7-426qz\" (UID: \"d7932403-615f-44e4-b195-4a83c19787ba\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.769222 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxcrf\" (UniqueName: \"kubernetes.io/projected/0ddf91ff-6d91-4213-8032-05f80408063d-kube-api-access-mxcrf\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7ldx8\" (UID: \"0ddf91ff-6d91-4213-8032-05f80408063d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.769326 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgnwd\" (UniqueName: \"kubernetes.io/projected/282f8f05-9a84-4bb4-a122-ba8806324ca3-kube-api-access-sgnwd\") pod \"test-operator-controller-manager-5c5cb9c4d7-k2rnj\" (UID: \"282f8f05-9a84-4bb4-a122-ba8806324ca3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:22 
crc kubenswrapper[4816]: I0311 12:15:22.784977 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.785853 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.789215 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxcrf\" (UniqueName: \"kubernetes.io/projected/0ddf91ff-6d91-4213-8032-05f80408063d-kube-api-access-mxcrf\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7ldx8\" (UID: \"0ddf91ff-6d91-4213-8032-05f80408063d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.789540 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vwbwj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.789766 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.790015 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.801278 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.809895 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9n7\" (UniqueName: \"kubernetes.io/projected/d7932403-615f-44e4-b195-4a83c19787ba-kube-api-access-4m9n7\") pod \"swift-operator-controller-manager-677c674df7-426qz\" (UID: \"d7932403-615f-44e4-b195-4a83c19787ba\") " 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.819026 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.819985 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.823823 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf"] Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.824493 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-l88nf" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.824590 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.845241 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.872621 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.872682 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.872746 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kcq2\" (UniqueName: \"kubernetes.io/projected/5f4b0b09-5704-432a-9cd4-82a296f3c467-kube-api-access-4kcq2\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.879058 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgnwd\" (UniqueName: \"kubernetes.io/projected/282f8f05-9a84-4bb4-a122-ba8806324ca3-kube-api-access-sgnwd\") pod \"test-operator-controller-manager-5c5cb9c4d7-k2rnj\" (UID: \"282f8f05-9a84-4bb4-a122-ba8806324ca3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:22 crc 
kubenswrapper[4816]: I0311 12:15:22.879295 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j895\" (UniqueName: \"kubernetes.io/projected/8e810ef6-d3f5-4133-bce2-234df32b3d10-kube-api-access-8j895\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dnqpf\" (UID: \"8e810ef6-d3f5-4133-bce2-234df32b3d10\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.879410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2b8h\" (UniqueName: \"kubernetes.io/projected/4126be7d-7ca8-4e68-94d4-ea21644fbd85-kube-api-access-q2b8h\") pod \"watcher-operator-controller-manager-6dd88c6f67-kx9nz\" (UID: \"4126be7d-7ca8-4e68-94d4-ea21644fbd85\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.879491 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.879814 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.879920 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:23.879900864 +0000 UTC m=+1010.471164831 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.922130 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2b8h\" (UniqueName: \"kubernetes.io/projected/4126be7d-7ca8-4e68-94d4-ea21644fbd85-kube-api-access-q2b8h\") pod \"watcher-operator-controller-manager-6dd88c6f67-kx9nz\" (UID: \"4126be7d-7ca8-4e68-94d4-ea21644fbd85\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.923865 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgnwd\" (UniqueName: \"kubernetes.io/projected/282f8f05-9a84-4bb4-a122-ba8806324ca3-kube-api-access-sgnwd\") pod \"test-operator-controller-manager-5c5cb9c4d7-k2rnj\" (UID: \"282f8f05-9a84-4bb4-a122-ba8806324ca3\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.983073 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j895\" (UniqueName: \"kubernetes.io/projected/8e810ef6-d3f5-4133-bce2-234df32b3d10-kube-api-access-8j895\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dnqpf\" (UID: \"8e810ef6-d3f5-4133-bce2-234df32b3d10\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.983206 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: 
\"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.983237 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: I0311 12:15:22.983316 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kcq2\" (UniqueName: \"kubernetes.io/projected/5f4b0b09-5704-432a-9cd4-82a296f3c467-kube-api-access-4kcq2\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.983461 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.983546 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:23.483522024 +0000 UTC m=+1010.074785991 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.983725 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:22 crc kubenswrapper[4816]: E0311 12:15:22.983789 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:23.483769111 +0000 UTC m=+1010.075033178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.008550 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j895\" (UniqueName: \"kubernetes.io/projected/8e810ef6-d3f5-4133-bce2-234df32b3d10-kube-api-access-8j895\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dnqpf\" (UID: \"8e810ef6-d3f5-4133-bce2-234df32b3d10\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.034160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kcq2\" (UniqueName: \"kubernetes.io/projected/5f4b0b09-5704-432a-9cd4-82a296f3c467-kube-api-access-4kcq2\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " 
pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.059734 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.087424 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.087581 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.087648 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:24.087627898 +0000 UTC m=+1010.678891865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.153462 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.209902 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.261568 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2"] Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.497392 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.497437 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.497599 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.497644 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.497703 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:24.497679821 +0000 UTC m=+1011.088943878 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.497721 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:24.497715572 +0000 UTC m=+1011.088979539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.676013 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" event={"ID":"6311ca5f-6f4c-4768-ae5e-75128be7f589","Type":"ContainerStarted","Data":"42f76418da9ac1dd9f3d67f38f034aad5fbc9040ac8a8efb960cc9e438b73c1b"} Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.702567 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm"] Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.742881 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228"] Mar 11 12:15:23 crc kubenswrapper[4816]: W0311 12:15:23.747224 4816 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73e00d02_6599_4cab_a32b_8fe96b82951a.slice/crio-816296e43888e78972adac3a0612fc0bc7ecda9b63b064b7f2079b767a8333a6 WatchSource:0}: Error finding container 816296e43888e78972adac3a0612fc0bc7ecda9b63b064b7f2079b767a8333a6: Status 404 returned error can't find the container with id 816296e43888e78972adac3a0612fc0bc7ecda9b63b064b7f2079b767a8333a6 Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.751547 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj"] Mar 11 12:15:23 crc kubenswrapper[4816]: W0311 12:15:23.754263 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8133b64_eb11_43ad_bf6e_a278af0ff466.slice/crio-7c0edc922c0f570ddf7e4782d583c20d7a06f05d79e9d1d10972db14503d209a WatchSource:0}: Error finding container 7c0edc922c0f570ddf7e4782d583c20d7a06f05d79e9d1d10972db14503d209a: Status 404 returned error can't find the container with id 7c0edc922c0f570ddf7e4782d583c20d7a06f05d79e9d1d10972db14503d209a Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.765057 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq"] Mar 11 12:15:23 crc kubenswrapper[4816]: W0311 12:15:23.768832 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e0c8832_9c20_44a9_933c_4a7fff032367.slice/crio-ee1374418873ee1ef0528d7e6e082f837d5a907a0a7537ec81d120ffb7351a25 WatchSource:0}: Error finding container ee1374418873ee1ef0528d7e6e082f837d5a907a0a7537ec81d120ffb7351a25: Status 404 returned error can't find the container with id ee1374418873ee1ef0528d7e6e082f837d5a907a0a7537ec81d120ffb7351a25 Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.781711 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x"] Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.788564 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4"] Mar 11 12:15:23 crc kubenswrapper[4816]: I0311 12:15:23.908083 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.908306 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:23 crc kubenswrapper[4816]: E0311 12:15:23.908371 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:25.908342691 +0000 UTC m=+1012.499606658 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.007634 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.102583 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.111890 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.112088 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.112151 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:26.112137461 +0000 UTC m=+1012.703401428 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.128712 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.149983 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sgnwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-k2rnj_openstack-operators(282f8f05-9a84-4bb4-a122-ba8806324ca3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.151080 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" podUID="282f8f05-9a84-4bb4-a122-ba8806324ca3" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.155017 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.162197 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hhjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-658d4cdd5-wnsst_openstack-operators(5d318732-8194-49eb-a2a3-c5b13ce843a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.164164 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" podUID="5d318732-8194-49eb-a2a3-c5b13ce843a7" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.167592 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.184396 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s9x8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-rr62t_openstack-operators(6bbceab2-fe2b-4693-867d-aa2a51261611): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.185533 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" podUID="6bbceab2-fe2b-4693-867d-aa2a51261611" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.188528 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4m9n7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-426qz_openstack-operators(d7932403-615f-44e4-b195-4a83c19787ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.191072 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" podUID="d7932403-615f-44e4-b195-4a83c19787ba" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.194355 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.204664 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.217201 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxcrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-7ldx8_openstack-operators(0ddf91ff-6d91-4213-8032-05f80408063d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.219496 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" podUID="0ddf91ff-6d91-4213-8032-05f80408063d" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.225938 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst"] Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.233119 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4gztj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-569cc54c5-rxhkb_openstack-operators(d1702062-37ba-43c0-becb-005e11f457a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.234444 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" podUID="d1702062-37ba-43c0-becb-005e11f457a0" Mar 11 12:15:24 crc kubenswrapper[4816]: W0311 12:15:24.247062 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4126be7d_7ca8_4e68_94d4_ea21644fbd85.slice/crio-103e18b3f96a7836490015b2ceb6df583489c99c24580345d3b7e471ea1806f6 WatchSource:0}: Error finding container 103e18b3f96a7836490015b2ceb6df583489c99c24580345d3b7e471ea1806f6: Status 404 returned error can't find the container with id 103e18b3f96a7836490015b2ceb6df583489c99c24580345d3b7e471ea1806f6 Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.248706 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q2b8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-kx9nz_openstack-operators(4126be7d-7ca8-4e68-94d4-ea21644fbd85): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.249945 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" podUID="4126be7d-7ca8-4e68-94d4-ea21644fbd85" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.255765 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-426qz"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.262294 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.284217 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.290734 4816 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.303868 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz"] Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.516678 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.516716 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.516907 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.516948 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:26.516935761 +0000 UTC m=+1013.108199728 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.517273 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.517299 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:26.517291941 +0000 UTC m=+1013.108555908 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.693728 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" event={"ID":"4126be7d-7ca8-4e68-94d4-ea21644fbd85","Type":"ContainerStarted","Data":"103e18b3f96a7836490015b2ceb6df583489c99c24580345d3b7e471ea1806f6"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.695664 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" podUID="4126be7d-7ca8-4e68-94d4-ea21644fbd85" Mar 11 12:15:24 crc 
kubenswrapper[4816]: I0311 12:15:24.700403 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" event={"ID":"e04ad395-8120-4c57-8575-611fa438e8fb","Type":"ContainerStarted","Data":"715e3d19002d44a4329be61357bbba9352a154017d1233e9cda5ae9c4fdd9256"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.702166 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" event={"ID":"8e810ef6-d3f5-4133-bce2-234df32b3d10","Type":"ContainerStarted","Data":"f8c717263b0e9d2e0f927e56f7842d64fa39521a9a93cac24913b3dc0e2bbb4a"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.704725 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" event={"ID":"a8133b64-eb11-43ad-bf6e-a278af0ff466","Type":"ContainerStarted","Data":"7c0edc922c0f570ddf7e4782d583c20d7a06f05d79e9d1d10972db14503d209a"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.707084 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" event={"ID":"b941b0f1-4a8f-4517-af46-cc77892fe3d9","Type":"ContainerStarted","Data":"275604cb58d94400d609852fdcfcbe587b6108308bb7f6979d39e1543e6a9201"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.712913 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" event={"ID":"d1702062-37ba-43c0-becb-005e11f457a0","Type":"ContainerStarted","Data":"a1c332a8f9f06d000e09778e625d18d473748e76952ca22651d101f601022583"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.715845 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" podUID="d1702062-37ba-43c0-becb-005e11f457a0" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.721658 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" event={"ID":"6bbceab2-fe2b-4693-867d-aa2a51261611","Type":"ContainerStarted","Data":"303c0f9283e81aeac2e6d6ab0e4ac3deefc1337a027f284b6b5e1e71c0f2e3d0"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.724171 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" podUID="6bbceab2-fe2b-4693-867d-aa2a51261611" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.727572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" event={"ID":"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2","Type":"ContainerStarted","Data":"3eea47678ede611c1001e6a7147d6e4c48fe5edb7f89235bfe1984abac1fb4e8"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.739985 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" event={"ID":"9e0c8832-9c20-44a9-933c-4a7fff032367","Type":"ContainerStarted","Data":"ee1374418873ee1ef0528d7e6e082f837d5a907a0a7537ec81d120ffb7351a25"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.743390 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" 
event={"ID":"73e00d02-6599-4cab-a32b-8fe96b82951a","Type":"ContainerStarted","Data":"816296e43888e78972adac3a0612fc0bc7ecda9b63b064b7f2079b767a8333a6"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.747483 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" event={"ID":"d7932403-615f-44e4-b195-4a83c19787ba","Type":"ContainerStarted","Data":"f10441d7d93b347d0276856ef7fdf2b6fed8a2030f74252fcac16c7a4aa73254"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.749144 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" podUID="d7932403-615f-44e4-b195-4a83c19787ba" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.750115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" event={"ID":"282f8f05-9a84-4bb4-a122-ba8806324ca3","Type":"ContainerStarted","Data":"f3332faccfc6d1c0624878dd32cf7b5036ae673b0b5f6a4282615ba11799463b"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.751363 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" podUID="282f8f05-9a84-4bb4-a122-ba8806324ca3" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.752875 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" 
event={"ID":"f37fb9b3-7b07-4188-b9ea-facfa5e945f0","Type":"ContainerStarted","Data":"960f1ff05d90dddfdf3807e6f61172bfd8cdcf0879ca635bac9a60c26d2fc27a"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.758019 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" event={"ID":"4d4c74ff-52a2-4426-bd06-daa6e9b1a832","Type":"ContainerStarted","Data":"1d9ba18a437f5f176c6134140198f809d2744bc517a548e8d0f0abe1739bccbd"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.763468 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" event={"ID":"c28c6622-633e-4e76-9c9a-eb732531fa1a","Type":"ContainerStarted","Data":"c72e7e43d63a0638bc9f93a540d4759ca3542365a1e02bdfa35caf9bb41150ef"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.765270 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" event={"ID":"0ddf91ff-6d91-4213-8032-05f80408063d","Type":"ContainerStarted","Data":"351d413bf7d307a846b7ee3c09d2f3de72f144af8aee0b96e7aead7cb3ad5f6a"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.768350 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" podUID="0ddf91ff-6d91-4213-8032-05f80408063d" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.771994 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" 
event={"ID":"5d318732-8194-49eb-a2a3-c5b13ce843a7","Type":"ContainerStarted","Data":"4df0eb5dea2ea13c7cee2efe31de48244b2d58a8ae118dd26ce26d06cd4001ab"} Mar 11 12:15:24 crc kubenswrapper[4816]: E0311 12:15:24.774444 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" podUID="5d318732-8194-49eb-a2a3-c5b13ce843a7" Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.784000 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" event={"ID":"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5","Type":"ContainerStarted","Data":"5958551027c88a2fe056b2a53c574ce8e91ed926acaca7d47471f3fb901d2d49"} Mar 11 12:15:24 crc kubenswrapper[4816]: I0311 12:15:24.787201 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" event={"ID":"72237264-5d09-40bd-ba83-f30b76790cb6","Type":"ContainerStarted","Data":"2e2820d9861887138ed0039a8cd963a27c345037072d95c861f80f47f028fbbf"} Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.807426 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" podUID="d7932403-615f-44e4-b195-4a83c19787ba" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.807943 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" podUID="282f8f05-9a84-4bb4-a122-ba8806324ca3" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.808605 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" podUID="6bbceab2-fe2b-4693-867d-aa2a51261611" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.808686 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" podUID="4126be7d-7ca8-4e68-94d4-ea21644fbd85" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.808906 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" podUID="0ddf91ff-6d91-4213-8032-05f80408063d" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.809601 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" podUID="5d318732-8194-49eb-a2a3-c5b13ce843a7" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.811723 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" podUID="d1702062-37ba-43c0-becb-005e11f457a0" Mar 11 12:15:25 crc kubenswrapper[4816]: I0311 12:15:25.961990 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.962169 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:25 crc kubenswrapper[4816]: E0311 12:15:25.962237 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:29.962216137 +0000 UTC m=+1016.553480104 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: I0311 12:15:26.172978 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.173129 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.173180 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:30.173165815 +0000 UTC m=+1016.764429782 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: I0311 12:15:26.586184 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:26 crc kubenswrapper[4816]: I0311 12:15:26.586234 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.586361 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.586401 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.586443 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:30.58642246 +0000 UTC m=+1017.177686417 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:26 crc kubenswrapper[4816]: E0311 12:15:26.586462 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:30.586454631 +0000 UTC m=+1017.177718598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: I0311 12:15:30.038472 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.038657 4816 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.039246 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert podName:a605e964-6e3c-4639-95d5-908f5d0ab7ef nodeName:}" failed. No retries permitted until 2026-03-11 12:15:38.039227695 +0000 UTC m=+1024.630491662 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert") pod "infra-operator-controller-manager-5995f4446f-hzd9q" (UID: "a605e964-6e3c-4639-95d5-908f5d0ab7ef") : secret "infra-operator-webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: I0311 12:15:30.242411 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.243467 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.243507 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:38.243493799 +0000 UTC m=+1024.834757766 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: I0311 12:15:30.650848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:30 crc kubenswrapper[4816]: I0311 12:15:30.650896 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.651065 4816 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.651072 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.651123 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:38.65110797 +0000 UTC m=+1025.242371937 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "metrics-server-cert" not found Mar 11 12:15:30 crc kubenswrapper[4816]: E0311 12:15:30.651151 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:38.651132841 +0000 UTC m=+1025.242396808 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:36 crc kubenswrapper[4816]: E0311 12:15:36.373231 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 11 12:15:36 crc kubenswrapper[4816]: E0311 12:15:36.373907 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-58dj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-h2vmc_openstack-operators(4d4c74ff-52a2-4426-bd06-daa6e9b1a832): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:15:36 crc kubenswrapper[4816]: E0311 12:15:36.375310 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" podUID="4d4c74ff-52a2-4426-bd06-daa6e9b1a832" Mar 11 12:15:36 crc kubenswrapper[4816]: E0311 12:15:36.908957 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" podUID="4d4c74ff-52a2-4426-bd06-daa6e9b1a832" Mar 11 12:15:37 crc kubenswrapper[4816]: E0311 12:15:37.041545 4816 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571" Mar 11 12:15:37 crc kubenswrapper[4816]: E0311 12:15:37.041758 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9b9sn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-56fsw_openstack-operators(b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:15:37 crc kubenswrapper[4816]: E0311 12:15:37.042865 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" podUID="b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5" Mar 11 12:15:37 crc kubenswrapper[4816]: E0311 12:15:37.915637 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" podUID="b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.075893 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.085177 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a605e964-6e3c-4639-95d5-908f5d0ab7ef-cert\") pod \"infra-operator-controller-manager-5995f4446f-hzd9q\" (UID: \"a605e964-6e3c-4639-95d5-908f5d0ab7ef\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.278601 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:38 crc kubenswrapper[4816]: E0311 12:15:38.278834 4816 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:38 crc kubenswrapper[4816]: E0311 12:15:38.279461 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert podName:78a7aebd-70a2-4608-a669-aea496cb6186 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:54.279436905 +0000 UTC m=+1040.870700872 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" (UID: "78a7aebd-70a2-4608-a669-aea496cb6186") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.321345 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vxmbd" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.330433 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.685150 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.685208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:38 crc kubenswrapper[4816]: E0311 12:15:38.685314 4816 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 12:15:38 crc kubenswrapper[4816]: E0311 12:15:38.685379 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs podName:5f4b0b09-5704-432a-9cd4-82a296f3c467 nodeName:}" failed. No retries permitted until 2026-03-11 12:15:54.685360678 +0000 UTC m=+1041.276624645 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-pt8n6" (UID: "5f4b0b09-5704-432a-9cd4-82a296f3c467") : secret "webhook-server-cert" not found Mar 11 12:15:38 crc kubenswrapper[4816]: I0311 12:15:38.697552 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.388557 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.389408 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nhgkz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-zczdq_openstack-operators(73e00d02-6599-4cab-a32b-8fe96b82951a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.390549 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" podUID="73e00d02-6599-4cab-a32b-8fe96b82951a" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.515446 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.515501 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.515538 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.515952 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.516006 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507" gracePeriod=600 Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.882329 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.882614 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8j895,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-dnqpf_openstack-operators(8e810ef6-d3f5-4133-bce2-234df32b3d10): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.883901 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" podUID="8e810ef6-d3f5-4133-bce2-234df32b3d10" Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.939317 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507" exitCode=0 Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.939381 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507"} Mar 11 12:15:39 crc kubenswrapper[4816]: I0311 12:15:39.939487 4816 scope.go:117] "RemoveContainer" containerID="45ccbed932001dc629a77de7e08e04a9cce25a78ac1e00aed407f7f4e1fa93a3" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.946388 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" podUID="73e00d02-6599-4cab-a32b-8fe96b82951a" Mar 11 12:15:39 crc kubenswrapper[4816]: E0311 12:15:39.946502 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" podUID="8e810ef6-d3f5-4133-bce2-234df32b3d10" Mar 11 12:15:40 crc kubenswrapper[4816]: I0311 12:15:40.278193 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q"] Mar 11 12:15:41 crc kubenswrapper[4816]: W0311 12:15:41.711024 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda605e964_6e3c_4639_95d5_908f5d0ab7ef.slice/crio-feb5aeba86168a60d262043807101ac42fc06cd533d933f3e3c564d62c04538d WatchSource:0}: Error finding container feb5aeba86168a60d262043807101ac42fc06cd533d933f3e3c564d62c04538d: Status 404 returned error can't find the container with id feb5aeba86168a60d262043807101ac42fc06cd533d933f3e3c564d62c04538d Mar 11 12:15:41 crc kubenswrapper[4816]: I0311 12:15:41.962601 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" event={"ID":"a605e964-6e3c-4639-95d5-908f5d0ab7ef","Type":"ContainerStarted","Data":"feb5aeba86168a60d262043807101ac42fc06cd533d933f3e3c564d62c04538d"} Mar 11 12:15:43 crc kubenswrapper[4816]: I0311 12:15:43.982299 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" event={"ID":"c28c6622-633e-4e76-9c9a-eb732531fa1a","Type":"ContainerStarted","Data":"c8430addf987a22f8b7f9cc01817da9117aa40bc7cc4e4054f46452d139bc56d"} Mar 11 12:15:43 crc kubenswrapper[4816]: I0311 12:15:43.982658 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:43 crc kubenswrapper[4816]: I0311 12:15:43.984468 4816 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96"} Mar 11 12:15:44 crc kubenswrapper[4816]: I0311 12:15:44.004821 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" podStartSLOduration=6.848546235 podStartE2EDuration="23.00479847s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.706917619 +0000 UTC m=+1010.298181586" lastFinishedPulling="2026-03-11 12:15:39.863169854 +0000 UTC m=+1026.454433821" observedRunningTime="2026-03-11 12:15:43.995681105 +0000 UTC m=+1030.586945072" watchObservedRunningTime="2026-03-11 12:15:44.00479847 +0000 UTC m=+1030.596062437" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.010278 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" event={"ID":"5d318732-8194-49eb-a2a3-c5b13ce843a7","Type":"ContainerStarted","Data":"abbfd80f58229b6d72ae7f8d78676a114f0f8da1cd2b73f61e3e29cae274284e"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.011996 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.013549 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" event={"ID":"72237264-5d09-40bd-ba83-f30b76790cb6","Type":"ContainerStarted","Data":"942341a5281631241462d79a359f97f11ed7566ccb1699d0364551d4d5ef5be0"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.013964 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.015824 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" event={"ID":"282f8f05-9a84-4bb4-a122-ba8806324ca3","Type":"ContainerStarted","Data":"5d515bb111e24b12c705fd89c78f2c56f37c61e64977287b82d743ee6f0d6fcf"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.016427 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.017729 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" event={"ID":"4126be7d-7ca8-4e68-94d4-ea21644fbd85","Type":"ContainerStarted","Data":"2b476045814439037242972d218641bae9ea40b03ddb8c8b0d4749bffcc6a4d0"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.018130 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.019332 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" event={"ID":"d7932403-615f-44e4-b195-4a83c19787ba","Type":"ContainerStarted","Data":"04b4f259d59a13b69141b50f407751c9a85a711831daa0770f651c45421e2ed1"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.019478 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.020339 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" 
event={"ID":"b941b0f1-4a8f-4517-af46-cc77892fe3d9","Type":"ContainerStarted","Data":"659931bfb6136077eba8a1ce9f1ad5c42f7a0daa2b4d85ec12342fe8e643d9a1"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.020682 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.022600 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" event={"ID":"6bbceab2-fe2b-4693-867d-aa2a51261611","Type":"ContainerStarted","Data":"cfaaefd9338b4d804d9d444e4f30c8446fc9961809ff5cff4353ee48c602e8b0"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.022765 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.024121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" event={"ID":"0ddf91ff-6d91-4213-8032-05f80408063d","Type":"ContainerStarted","Data":"dd5fef126a8fd8da0671e4f30b167fffe0b57b171babb991338633f629165ad9"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.024568 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.025897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" event={"ID":"9e0c8832-9c20-44a9-933c-4a7fff032367","Type":"ContainerStarted","Data":"691fface6b191965f93bb720406f44bbae035a07c03ac4459acb5d5a0c9b2faf"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.026261 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.027334 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" event={"ID":"a8133b64-eb11-43ad-bf6e-a278af0ff466","Type":"ContainerStarted","Data":"d419c87fd8a27b5e0600fdd812f665f40abb8b3227009bcd8a1a0f3bc3f08690"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.027707 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.028678 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" event={"ID":"a605e964-6e3c-4639-95d5-908f5d0ab7ef","Type":"ContainerStarted","Data":"b51333b0db15e7eddfdf1012e14dd0d99fc098612ec285dc040a9637da6c5c69"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.029070 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.032441 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" event={"ID":"f37fb9b3-7b07-4188-b9ea-facfa5e945f0","Type":"ContainerStarted","Data":"3925199c7896620f3767b243b460c1bd6950fae8f4b788cafd55b81e4d76d5ec"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.032959 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.035749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" 
event={"ID":"bcfe1f90-2b5f-43b7-b798-0bad62ec53b2","Type":"ContainerStarted","Data":"0fd530b9837f3faa33a1e00ce94b6d75a468faab7a778abf88098233d30b4597"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.036040 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.040758 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" event={"ID":"6311ca5f-6f4c-4768-ae5e-75128be7f589","Type":"ContainerStarted","Data":"a47d527368a9a736ee8d9a5f49880bf1618c7aeef362a98017cef5b3e1d3d239"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.040897 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.047194 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" event={"ID":"d1702062-37ba-43c0-becb-005e11f457a0","Type":"ContainerStarted","Data":"9742b5fc0c9cc8109db8e87af51553670053bcde78357c7550eeee43432ad9f9"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.047885 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.048959 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" event={"ID":"e04ad395-8120-4c57-8575-611fa438e8fb","Type":"ContainerStarted","Data":"f1807fc6bfaf31ecad9a00ec9b1dc5ed0820d9d578634c1d64b844e0387a9479"} Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.049374 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.163201 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" podStartSLOduration=9.072622486 podStartE2EDuration="25.163182152s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.775009577 +0000 UTC m=+1010.366273544" lastFinishedPulling="2026-03-11 12:15:39.865569243 +0000 UTC m=+1026.456833210" observedRunningTime="2026-03-11 12:15:47.160270297 +0000 UTC m=+1033.751534264" watchObservedRunningTime="2026-03-11 12:15:47.163182152 +0000 UTC m=+1033.754446119" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.164073 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" podStartSLOduration=3.279869286 podStartE2EDuration="25.164067068s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.162064202 +0000 UTC m=+1010.753328169" lastFinishedPulling="2026-03-11 12:15:46.046261984 +0000 UTC m=+1032.637525951" observedRunningTime="2026-03-11 12:15:47.095110774 +0000 UTC m=+1033.686374741" watchObservedRunningTime="2026-03-11 12:15:47.164067068 +0000 UTC m=+1033.755331025" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.207449 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" podStartSLOduration=7.52338157 podStartE2EDuration="26.207432677s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.771232408 +0000 UTC m=+1010.362496375" lastFinishedPulling="2026-03-11 12:15:42.455283515 +0000 UTC m=+1029.046547482" observedRunningTime="2026-03-11 12:15:47.204679287 +0000 UTC m=+1033.795943244" 
watchObservedRunningTime="2026-03-11 12:15:47.207432677 +0000 UTC m=+1033.798696644" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.236915 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" podStartSLOduration=9.729260721 podStartE2EDuration="26.236892123s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.344360517 +0000 UTC m=+1009.935624494" lastFinishedPulling="2026-03-11 12:15:39.851991929 +0000 UTC m=+1026.443255896" observedRunningTime="2026-03-11 12:15:47.230548179 +0000 UTC m=+1033.821812146" watchObservedRunningTime="2026-03-11 12:15:47.236892123 +0000 UTC m=+1033.828156090" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.318940 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" podStartSLOduration=9.593496628 podStartE2EDuration="25.318921146s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.126678304 +0000 UTC m=+1010.717942271" lastFinishedPulling="2026-03-11 12:15:39.852102822 +0000 UTC m=+1026.443366789" observedRunningTime="2026-03-11 12:15:47.308909355 +0000 UTC m=+1033.900173322" watchObservedRunningTime="2026-03-11 12:15:47.318921146 +0000 UTC m=+1033.910185113" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.361489 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" podStartSLOduration=3.415124363 podStartE2EDuration="25.361469592s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.188285553 +0000 UTC m=+1010.779549520" lastFinishedPulling="2026-03-11 12:15:46.134630772 +0000 UTC m=+1032.725894749" observedRunningTime="2026-03-11 12:15:47.359538006 +0000 UTC m=+1033.950801973" 
watchObservedRunningTime="2026-03-11 12:15:47.361469592 +0000 UTC m=+1033.952733559" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.405162 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" podStartSLOduration=7.03112657 podStartE2EDuration="25.405144941s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.14102001 +0000 UTC m=+1010.732283977" lastFinishedPulling="2026-03-11 12:15:42.515038381 +0000 UTC m=+1029.106302348" observedRunningTime="2026-03-11 12:15:47.403315388 +0000 UTC m=+1033.994579345" watchObservedRunningTime="2026-03-11 12:15:47.405144941 +0000 UTC m=+1033.996408908" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.453764 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" podStartSLOduration=7.746052859 podStartE2EDuration="26.453747183s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.747588381 +0000 UTC m=+1010.338852348" lastFinishedPulling="2026-03-11 12:15:42.455282705 +0000 UTC m=+1029.046546672" observedRunningTime="2026-03-11 12:15:47.449889391 +0000 UTC m=+1034.041153358" watchObservedRunningTime="2026-03-11 12:15:47.453747183 +0000 UTC m=+1034.045011150" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.496785 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" podStartSLOduration=21.165345343 podStartE2EDuration="25.496766303s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:41.71468235 +0000 UTC m=+1028.305946317" lastFinishedPulling="2026-03-11 12:15:46.04610331 +0000 UTC m=+1032.637367277" observedRunningTime="2026-03-11 12:15:47.492518649 +0000 UTC m=+1034.083782616" 
watchObservedRunningTime="2026-03-11 12:15:47.496766303 +0000 UTC m=+1034.088030270" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.541598 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" podStartSLOduration=6.262499231 podStartE2EDuration="25.541583175s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.149792425 +0000 UTC m=+1010.741056382" lastFinishedPulling="2026-03-11 12:15:43.428876359 +0000 UTC m=+1030.020140326" observedRunningTime="2026-03-11 12:15:47.540834153 +0000 UTC m=+1034.132098120" watchObservedRunningTime="2026-03-11 12:15:47.541583175 +0000 UTC m=+1034.132847142" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.644575 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" podStartSLOduration=7.258962549 podStartE2EDuration="25.644556066s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.21707642 +0000 UTC m=+1010.808340387" lastFinishedPulling="2026-03-11 12:15:42.602669937 +0000 UTC m=+1029.193933904" observedRunningTime="2026-03-11 12:15:47.572553425 +0000 UTC m=+1034.163817392" watchObservedRunningTime="2026-03-11 12:15:47.644556066 +0000 UTC m=+1034.235820033" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.671897 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" podStartSLOduration=7.996287689 podStartE2EDuration="26.67187528s" podCreationTimestamp="2026-03-11 12:15:21 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.779801577 +0000 UTC m=+1010.371065544" lastFinishedPulling="2026-03-11 12:15:42.455389168 +0000 UTC m=+1029.046653135" observedRunningTime="2026-03-11 12:15:47.646772241 +0000 UTC m=+1034.238036198" 
watchObservedRunningTime="2026-03-11 12:15:47.67187528 +0000 UTC m=+1034.263139247" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.675260 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" podStartSLOduration=9.107292824 podStartE2EDuration="25.675235458s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.149422424 +0000 UTC m=+1010.740686391" lastFinishedPulling="2026-03-11 12:15:40.717365058 +0000 UTC m=+1027.308629025" observedRunningTime="2026-03-11 12:15:47.670577962 +0000 UTC m=+1034.261841929" watchObservedRunningTime="2026-03-11 12:15:47.675235458 +0000 UTC m=+1034.266499425" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.701331 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" podStartSLOduration=7.282537434 podStartE2EDuration="25.701315215s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.184244136 +0000 UTC m=+1010.775508103" lastFinishedPulling="2026-03-11 12:15:42.603021917 +0000 UTC m=+1029.194285884" observedRunningTime="2026-03-11 12:15:47.697677699 +0000 UTC m=+1034.288941666" watchObservedRunningTime="2026-03-11 12:15:47.701315215 +0000 UTC m=+1034.292579182" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.760569 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" podStartSLOduration=3.924233864 podStartE2EDuration="25.760549896s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.248584395 +0000 UTC m=+1010.839848362" lastFinishedPulling="2026-03-11 12:15:46.084900437 +0000 UTC m=+1032.676164394" observedRunningTime="2026-03-11 12:15:47.757987551 +0000 UTC m=+1034.349251518" 
watchObservedRunningTime="2026-03-11 12:15:47.760549896 +0000 UTC m=+1034.351813863" Mar 11 12:15:47 crc kubenswrapper[4816]: I0311 12:15:47.761129 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" podStartSLOduration=7.390835301 podStartE2EDuration="25.761122523s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.232985602 +0000 UTC m=+1010.824249579" lastFinishedPulling="2026-03-11 12:15:42.603272834 +0000 UTC m=+1029.194536801" observedRunningTime="2026-03-11 12:15:47.732183732 +0000 UTC m=+1034.323447689" watchObservedRunningTime="2026-03-11 12:15:47.761122523 +0000 UTC m=+1034.352386490" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.101873 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" event={"ID":"4d4c74ff-52a2-4426-bd06-daa6e9b1a832","Type":"ContainerStarted","Data":"1499fc908d58d80a97b46f8ad0f79220c2cf1db2554d3028ff7b0f181f57d55e"} Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.102610 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.106588 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" event={"ID":"8e810ef6-d3f5-4133-bce2-234df32b3d10","Type":"ContainerStarted","Data":"d53f55a3ba7f10ddf2d38af4360c3e85b1eb5efe7e8538fc71ea168aeadcf553"} Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.120216 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" podStartSLOduration=2.28625565 podStartE2EDuration="30.120189264s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" 
firstStartedPulling="2026-03-11 12:15:24.020210881 +0000 UTC m=+1010.611474858" lastFinishedPulling="2026-03-11 12:15:51.854144495 +0000 UTC m=+1038.445408472" observedRunningTime="2026-03-11 12:15:52.119099612 +0000 UTC m=+1038.710363599" watchObservedRunningTime="2026-03-11 12:15:52.120189264 +0000 UTC m=+1038.711453241" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.145019 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dnqpf" podStartSLOduration=2.506327663 podStartE2EDuration="30.144990265s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.126968902 +0000 UTC m=+1010.718232869" lastFinishedPulling="2026-03-11 12:15:51.765631494 +0000 UTC m=+1038.356895471" observedRunningTime="2026-03-11 12:15:52.138444414 +0000 UTC m=+1038.729708391" watchObservedRunningTime="2026-03-11 12:15:52.144990265 +0000 UTC m=+1038.736254252" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.246993 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rb228" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.252986 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-g8cg2" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.288134 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-fjkn4" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.361094 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-px2wm" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.390529 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-66ctj" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.405564 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-8v46x" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.454614 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-874hd" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.670793 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-wnsst" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.681239 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-bl9hm" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.714975 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rxhkb" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.737995 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rr62t" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.768290 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h7kgb" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.827557 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7ldx8" Mar 11 12:15:52 crc kubenswrapper[4816]: I0311 12:15:52.856519 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-426qz" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.122451 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" event={"ID":"73e00d02-6599-4cab-a32b-8fe96b82951a","Type":"ContainerStarted","Data":"4d2d5d8b030b845e614b1392c2879fe10232fb99a2bd70f5e27a764161adf148"} Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.122650 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.123677 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" event={"ID":"b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5","Type":"ContainerStarted","Data":"214ad0558a154141454764000bc45f7800c0efc1fa8bb9b88e0d9c3486659348"} Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.123848 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.153334 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" podStartSLOduration=2.296183687 podStartE2EDuration="31.153317327s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:23.758104516 +0000 UTC m=+1010.349368483" lastFinishedPulling="2026-03-11 12:15:52.615238156 +0000 UTC m=+1039.206502123" observedRunningTime="2026-03-11 12:15:53.152919626 +0000 UTC m=+1039.744183603" watchObservedRunningTime="2026-03-11 12:15:53.153317327 +0000 UTC m=+1039.744581294" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.159718 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-k2rnj" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.170584 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" podStartSLOduration=2.700443972 podStartE2EDuration="31.170568968s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:24.145888462 +0000 UTC m=+1010.737152429" lastFinishedPulling="2026-03-11 12:15:52.616013448 +0000 UTC m=+1039.207277425" observedRunningTime="2026-03-11 12:15:53.168307383 +0000 UTC m=+1039.759571350" watchObservedRunningTime="2026-03-11 12:15:53.170568968 +0000 UTC m=+1039.761832935" Mar 11 12:15:53 crc kubenswrapper[4816]: I0311 12:15:53.212824 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-kx9nz" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.375743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.395488 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78a7aebd-70a2-4608-a669-aea496cb6186-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l\" (UID: \"78a7aebd-70a2-4608-a669-aea496cb6186\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.622435 4816 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8r4xj" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.641758 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.784535 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.797295 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f4b0b09-5704-432a-9cd4-82a296f3c467-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-pt8n6\" (UID: \"5f4b0b09-5704-432a-9cd4-82a296f3c467\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.840665 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vwbwj" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.843382 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:15:54 crc kubenswrapper[4816]: I0311 12:15:54.992284 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l"] Mar 11 12:15:55 crc kubenswrapper[4816]: I0311 12:15:55.151316 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" event={"ID":"78a7aebd-70a2-4608-a669-aea496cb6186","Type":"ContainerStarted","Data":"ff8e23ab061531f91817bf87163f31937a8930b4b4b91fa3df169233f32e38c0"} Mar 11 12:15:55 crc kubenswrapper[4816]: I0311 12:15:55.170041 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6"] Mar 11 12:15:55 crc kubenswrapper[4816]: W0311 12:15:55.179389 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f4b0b09_5704_432a_9cd4_82a296f3c467.slice/crio-44c4bbb51f6801e313e560f23d15f817744193d1651889d60124f9eac4783cc8 WatchSource:0}: Error finding container 44c4bbb51f6801e313e560f23d15f817744193d1651889d60124f9eac4783cc8: Status 404 returned error can't find the container with id 44c4bbb51f6801e313e560f23d15f817744193d1651889d60124f9eac4783cc8 Mar 11 12:15:56 crc kubenswrapper[4816]: I0311 12:15:56.159101 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" event={"ID":"5f4b0b09-5704-432a-9cd4-82a296f3c467","Type":"ContainerStarted","Data":"44c4bbb51f6801e313e560f23d15f817744193d1651889d60124f9eac4783cc8"} Mar 11 12:15:58 crc kubenswrapper[4816]: I0311 12:15:58.339479 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-hzd9q" Mar 11 12:16:00 crc 
kubenswrapper[4816]: I0311 12:16:00.143214 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.144474 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.152850 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.153774 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.154068 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.154417 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.165659 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") pod \"auto-csr-approver-29553856-7k69r\" (UID: \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\") " pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.266435 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") pod \"auto-csr-approver-29553856-7k69r\" (UID: \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\") " pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.285075 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") pod \"auto-csr-approver-29553856-7k69r\" (UID: \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\") " pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.462992 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:00 crc kubenswrapper[4816]: I0311 12:16:00.933942 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:16:00 crc kubenswrapper[4816]: W0311 12:16:00.941667 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79fb6b17_9d8a_4f10_8a93_a3e65f470a27.slice/crio-7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02 WatchSource:0}: Error finding container 7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02: Status 404 returned error can't find the container with id 7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02 Mar 11 12:16:01 crc kubenswrapper[4816]: I0311 12:16:01.205180 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553856-7k69r" event={"ID":"79fb6b17-9d8a-4f10-8a93-a3e65f470a27","Type":"ContainerStarted","Data":"7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02"} Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.214053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" event={"ID":"5f4b0b09-5704-432a-9cd4-82a296f3c467","Type":"ContainerStarted","Data":"dbe7e8da8b68665fef8578c00790d8dbd642f082ba9bdeeefe39c1b2da581690"} Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.214468 4816 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.252554 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" podStartSLOduration=40.252524851 podStartE2EDuration="40.252524851s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:16:02.251962794 +0000 UTC m=+1048.843226771" watchObservedRunningTime="2026-03-11 12:16:02.252524851 +0000 UTC m=+1048.843788858" Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.464317 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-zczdq" Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.539792 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h2vmc" Mar 11 12:16:02 crc kubenswrapper[4816]: I0311 12:16:02.694449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-56fsw" Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 12:16:07.250367 4816 generic.go:334] "Generic (PLEG): container finished" podID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" containerID="5d6df61e0b509a66b3346da65b74fba3a74851e8e005a57c5d0fba5a7957a438" exitCode=0 Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 12:16:07.250459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553856-7k69r" event={"ID":"79fb6b17-9d8a-4f10-8a93-a3e65f470a27","Type":"ContainerDied","Data":"5d6df61e0b509a66b3346da65b74fba3a74851e8e005a57c5d0fba5a7957a438"} Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 
12:16:07.253495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" event={"ID":"78a7aebd-70a2-4608-a669-aea496cb6186","Type":"ContainerStarted","Data":"d47df04afa1e477a610ffc4e9a31784b39656f4d4e8075507e9d6673b54137b9"} Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 12:16:07.253593 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:16:07 crc kubenswrapper[4816]: I0311 12:16:07.302677 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" podStartSLOduration=33.910327758 podStartE2EDuration="45.302659949s" podCreationTimestamp="2026-03-11 12:15:22 +0000 UTC" firstStartedPulling="2026-03-11 12:15:55.002824565 +0000 UTC m=+1041.594088532" lastFinishedPulling="2026-03-11 12:16:06.395156756 +0000 UTC m=+1052.986420723" observedRunningTime="2026-03-11 12:16:07.299046034 +0000 UTC m=+1053.890310011" watchObservedRunningTime="2026-03-11 12:16:07.302659949 +0000 UTC m=+1053.893923926" Mar 11 12:16:08 crc kubenswrapper[4816]: I0311 12:16:08.608817 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:08 crc kubenswrapper[4816]: I0311 12:16:08.710476 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") pod \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\" (UID: \"79fb6b17-9d8a-4f10-8a93-a3e65f470a27\") " Mar 11 12:16:08 crc kubenswrapper[4816]: I0311 12:16:08.716367 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch" (OuterVolumeSpecName: "kube-api-access-z5dch") pod "79fb6b17-9d8a-4f10-8a93-a3e65f470a27" (UID: "79fb6b17-9d8a-4f10-8a93-a3e65f470a27"). InnerVolumeSpecName "kube-api-access-z5dch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:16:08 crc kubenswrapper[4816]: I0311 12:16:08.811771 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5dch\" (UniqueName: \"kubernetes.io/projected/79fb6b17-9d8a-4f10-8a93-a3e65f470a27-kube-api-access-z5dch\") on node \"crc\" DevicePath \"\"" Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.273499 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553856-7k69r" event={"ID":"79fb6b17-9d8a-4f10-8a93-a3e65f470a27","Type":"ContainerDied","Data":"7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02"} Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.273558 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e3f4449a02d0a67344714e7731e0eb869633b7a8599cad40fa72cf10c713c02" Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.273586 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553856-7k69r" Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.738319 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:16:09 crc kubenswrapper[4816]: I0311 12:16:09.744044 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553850-v7tlf"] Mar 11 12:16:10 crc kubenswrapper[4816]: I0311 12:16:10.140013 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac620ec-72d5-4603-852f-8ba3f1ad0e9b" path="/var/lib/kubelet/pods/5ac620ec-72d5-4603-852f-8ba3f1ad0e9b/volumes" Mar 11 12:16:14 crc kubenswrapper[4816]: I0311 12:16:14.654922 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l" Mar 11 12:16:14 crc kubenswrapper[4816]: I0311 12:16:14.848904 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-pt8n6" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.218575 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"] Mar 11 12:16:31 crc kubenswrapper[4816]: E0311 12:16:31.219370 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" containerName="oc" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.219386 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" containerName="oc" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.219533 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" containerName="oc" Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.220221 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.223658 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.223880 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.225050 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-sh8cd"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.225053 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.242166 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"]
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.301463 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"]
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.305118 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.307217 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.313743 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"]
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.400835 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.400901 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502198 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502260 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502298 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.502343 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.503171 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.529149 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") pod \"dnsmasq-dns-5448ff6dc7-dbfn9\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.537763 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.603288 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.603685 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.603722 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.604422 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.604445 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.655303 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") pod \"dnsmasq-dns-64696987c5-bkgpq\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.831755 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"]
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.839938 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 12:16:31 crc kubenswrapper[4816]: I0311 12:16:31.918347 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:32 crc kubenswrapper[4816]: I0311 12:16:32.192367 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"]
Mar 11 12:16:32 crc kubenswrapper[4816]: I0311 12:16:32.460640 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" event={"ID":"0f513c34-8707-46dd-9b55-e953666df46c","Type":"ContainerStarted","Data":"34349f2681e98adca418f57aa55e4bcf6f5a91d14ddfb746d02fa6d79fb45869"}
Mar 11 12:16:32 crc kubenswrapper[4816]: I0311 12:16:32.461436 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" event={"ID":"e986d513-8aa0-4908-b200-d6212f56cd0f","Type":"ContainerStarted","Data":"6c939d71a23fbd96f7ac8915514c1d72476d3ae7287584fbe750ce02fb1ef302"}
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.329199 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"]
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.359107 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"]
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.360829 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.370036 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"]
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.482256 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.482325 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.482358 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.585760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.585825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.585859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.587109 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.587211 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.629633 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") pod \"dnsmasq-dns-658f55c9f5-wc7mw\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") " pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.679746 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.777613 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"]
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.809590 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"]
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.811244 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.824400 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"]
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.828843 4816 scope.go:117] "RemoveContainer" containerID="c3ad155fc5f3f7204d5fb77b61c79c6603bc6f42436d74dbc3171b2dbf21bbd2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.892823 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.892902 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.892951 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.994331 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.994389 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.994427 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.995378 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:34 crc kubenswrapper[4816]: I0311 12:16:34.995392 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.014808 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") pod \"dnsmasq-dns-54b5dffb47-ngbb2\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.137968 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.365788 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"]
Mar 11 12:16:35 crc kubenswrapper[4816]: W0311 12:16:35.367239 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80db2c12_e3f3_4f0e_8201_435f1a0b27c5.slice/crio-d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37 WatchSource:0}: Error finding container d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37: Status 404 returned error can't find the container with id d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.496292 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.497672 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.504945 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.506870 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507047 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507261 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507433 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507594 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-78f5m"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.507739 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.526171 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.529572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" event={"ID":"80db2c12-e3f3-4f0e-8201-435f1a0b27c5","Type":"ContainerStarted","Data":"d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37"}
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618080 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618155 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618210 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.618237 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623451 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623598 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623667 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623714 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623749 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.623789 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.687733 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"]
Mar 11 12:16:35 crc kubenswrapper[4816]: W0311 12:16:35.712313 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe27f2c_fcd4_42f9_8d14_9ad29dbf86b5.slice/crio-c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82 WatchSource:0}: Error finding container c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82: Status 404 returned error can't find the container with id c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725681 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725715 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725742 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725782 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725851 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.725901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.726142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.726168 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.726187 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.726207 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.728012 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.728021 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.728356 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.729992 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.732233 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.734347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.734364 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.735212 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.734367 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.739899 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.743903 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.751248 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.823301 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.922392 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.924381 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.943905 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974394 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974410 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974412 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974715 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974956 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.974972 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-hgw2n"
Mar 11 12:16:35 crc kubenswrapper[4816]: I0311 12:16:35.975409 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031293 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0"
Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031392 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv8dl\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0"
Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031426 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0"
Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031452 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0"
Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0"
Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031569 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") "
pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031712 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031839 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.031982 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.032037 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.032148 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc 
kubenswrapper[4816]: I0311 12:16:36.134611 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.134793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136651 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136706 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136793 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136830 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136878 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.136964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv8dl\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.137009 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.137038 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " 
pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.137068 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.137715 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.138388 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.138800 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.139574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.146557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.147850 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.148329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.148385 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.158969 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.159771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.163677 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv8dl\" 
(UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.165348 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.305819 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.544822 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerStarted","Data":"c73f7e4d7f0f4588b80903c0c3810420cc3aeed26ba2c6224b092ad58bda611c"} Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.546175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerStarted","Data":"c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82"} Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.576737 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.578628 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.587159 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.588112 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bvh4z" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.588420 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.588932 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.590669 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.592934 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.647954 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648039 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648083 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648125 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648159 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648180 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648205 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.648247 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txrmx\" (UniqueName: 
\"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750258 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750336 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750469 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750508 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750534 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750563 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.750614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txrmx\" (UniqueName: \"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.751679 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.755245 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.756557 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.756896 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.756973 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.757111 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.759044 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.805974 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txrmx\" (UniqueName: 
\"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.874007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " pod="openstack/openstack-galera-0" Mar 11 12:16:36 crc kubenswrapper[4816]: I0311 12:16:36.913192 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.112369 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.113993 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.116150 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-n5gxr" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.120277 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.120986 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.121012 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.175823 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.186854 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.186909 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.186951 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.186973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.187001 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 
12:16:38.187028 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.187067 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.187113 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0" Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.285321 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.286380 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.293959 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.293980 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-v9fqr"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.294225 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296697 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296825 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296943 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.296996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.297022 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.297051 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.297070 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.298991 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.301243 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.301537 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.302164 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.311936 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.319228 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.326652 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.330723 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.338487 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.357911 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398533 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398671 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398786 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398858 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.398882 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.455801 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500680 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500761 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500792 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.500934 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.505472 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.505769 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.518845 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.519007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.521664 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") pod \"memcached-0\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " pod="openstack/memcached-0"
Mar 11 12:16:38 crc kubenswrapper[4816]: I0311 12:16:38.691417 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.365546 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.367030 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.371269 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-h868s"
Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.374565 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.436059 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") pod \"kube-state-metrics-0\" (UID: \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\") " pod="openstack/kube-state-metrics-0"
Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.538567 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") pod \"kube-state-metrics-0\" (UID: \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\") " pod="openstack/kube-state-metrics-0"
Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.561214 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") pod \"kube-state-metrics-0\" (UID: \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\") " pod="openstack/kube-state-metrics-0"
Mar 11 12:16:40 crc kubenswrapper[4816]: I0311 12:16:40.717527 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.880582 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-84rn8"]
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.882098 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.885135 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.885224 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7f7bt"
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.885720 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.896470 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84rn8"]
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.943495 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"]
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.945009 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:43 crc kubenswrapper[4816]: I0311 12:16:43.956934 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"]
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025301 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025341 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025362 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7685\" (UniqueName: \"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025398 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025424 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025554 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025576 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025606 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025637 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025655 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025691 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025721 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.025742 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.127157 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.127209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7685\" (UniqueName: \"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.127645 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.127701 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.130393 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134546 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134701 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134741 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134879 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134906 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134948 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.134971 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.135045 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.135156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.135240 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.135559 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.141537 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.145187 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.147092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.147423 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.149454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.149728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.164071 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7685\" (UniqueName: \"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") pod \"ovn-controller-ovs-tnhfq\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.261918 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") pod \"ovn-controller-84rn8\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.321735 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.512785 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84rn8"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.724431 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.728556 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.732009 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.733692 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.733834 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.733956 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xckrg"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.739066 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.745196 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874466 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874534 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874604 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874678 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874736 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874768 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874796 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.874823 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977399 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977466 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977501 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977543 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977574 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977598 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977621 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.977642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.978020 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.978730 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.979221 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.979873 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:44 crc kubenswrapper[4816]: I0311 12:16:44.987089 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:44.992857 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:44.993794 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0"
Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:44.999122 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") "
pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:45.009488 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:45 crc kubenswrapper[4816]: I0311 12:16:45.056832 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.418824 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.421392 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.423539 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.424479 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.424761 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.432204 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zs9k7" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.436965 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536463 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536497 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536538 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536613 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" 
Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.536638 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.537849 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.639920 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.639986 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640033 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640080 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640178 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640224 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.640291 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.642086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.642402 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.642429 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.642475 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.649340 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.654492 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.672452 4816 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.676687 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.685440 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:47 crc kubenswrapper[4816]: I0311 12:16:47.795171 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 12:16:51 crc kubenswrapper[4816]: I0311 12:16:51.068948 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.670761 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.671580 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxrph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-dbfn9_openstack(0f513c34-8707-46dd-9b55-e953666df46c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.672898 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" podUID="0f513c34-8707-46dd-9b55-e953666df46c" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.676521 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.676765 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p56ws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-bkgpq_openstack(e986d513-8aa0-4908-b200-d6212f56cd0f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.678535 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" podUID="e986d513-8aa0-4908-b200-d6212f56cd0f" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.686465 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.686663 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfcps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-wc7mw_openstack(80db2c12-e3f3-4f0e-8201-435f1a0b27c5): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Mar 11 12:16:55 crc kubenswrapper[4816]: E0311 12:16:55.688156 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" podUID="80db2c12-e3f3-4f0e-8201-435f1a0b27c5" Mar 11 12:16:55 crc kubenswrapper[4816]: I0311 12:16:55.772113 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e9e4e8b-b60c-4c37-974a-8bdc1b243135","Type":"ContainerStarted","Data":"e914685ae7eb058c653bc79edb98cb710a39f5ce6911740300b8ce8933b04af8"} Mar 11 12:16:56 crc kubenswrapper[4816]: E0311 12:16:56.124698 4816 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 11 12:16:56 crc kubenswrapper[4816]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/80db2c12-e3f3-4f0e-8201-435f1a0b27c5/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 11 12:16:56 crc kubenswrapper[4816]: > podSandboxID="d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37" Mar 11 12:16:56 crc kubenswrapper[4816]: E0311 12:16:56.125153 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:16:56 crc kubenswrapper[4816]: init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfcps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-wc7mw_openstack(80db2c12-e3f3-4f0e-8201-435f1a0b27c5): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/80db2c12-e3f3-4f0e-8201-435f1a0b27c5/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 
11 12:16:56 crc kubenswrapper[4816]: > logger="UnhandledError" Mar 11 12:16:56 crc kubenswrapper[4816]: E0311 12:16:56.126319 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/80db2c12-e3f3-4f0e-8201-435f1a0b27c5/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" podUID="80db2c12-e3f3-4f0e-8201-435f1a0b27c5" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.368954 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.379902 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.530888 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84rn8"] Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.537928 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") pod \"0f513c34-8707-46dd-9b55-e953666df46c\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") " Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.538150 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") pod \"e986d513-8aa0-4908-b200-d6212f56cd0f\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") " Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.539426 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc" 
(OuterVolumeSpecName: "dns-svc") pod "e986d513-8aa0-4908-b200-d6212f56cd0f" (UID: "e986d513-8aa0-4908-b200-d6212f56cd0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.539534 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") pod \"e986d513-8aa0-4908-b200-d6212f56cd0f\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") "
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.539565 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") pod \"0f513c34-8707-46dd-9b55-e953666df46c\" (UID: \"0f513c34-8707-46dd-9b55-e953666df46c\") "
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.540725 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") pod \"e986d513-8aa0-4908-b200-d6212f56cd0f\" (UID: \"e986d513-8aa0-4908-b200-d6212f56cd0f\") "
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.540063 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config" (OuterVolumeSpecName: "config") pod "e986d513-8aa0-4908-b200-d6212f56cd0f" (UID: "e986d513-8aa0-4908-b200-d6212f56cd0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.540520 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config" (OuterVolumeSpecName: "config") pod "0f513c34-8707-46dd-9b55-e953666df46c" (UID: "0f513c34-8707-46dd-9b55-e953666df46c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.541536 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.549743 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph" (OuterVolumeSpecName: "kube-api-access-sxrph") pod "0f513c34-8707-46dd-9b55-e953666df46c" (UID: "0f513c34-8707-46dd-9b55-e953666df46c"). InnerVolumeSpecName "kube-api-access-sxrph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.552711 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws" (OuterVolumeSpecName: "kube-api-access-p56ws") pod "e986d513-8aa0-4908-b200-d6212f56cd0f" (UID: "e986d513-8aa0-4908-b200-d6212f56cd0f"). InnerVolumeSpecName "kube-api-access-p56ws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.571001 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.579613 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.604914 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.644069 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p56ws\" (UniqueName: \"kubernetes.io/projected/e986d513-8aa0-4908-b200-d6212f56cd0f-kube-api-access-p56ws\") on node \"crc\" DevicePath \"\""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.644105 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxrph\" (UniqueName: \"kubernetes.io/projected/0f513c34-8707-46dd-9b55-e953666df46c-kube-api-access-sxrph\") on node \"crc\" DevicePath \"\""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.644119 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e986d513-8aa0-4908-b200-d6212f56cd0f-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.644132 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f513c34-8707-46dd-9b55-e953666df46c-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.669720 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.732386 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.790524 4816 generic.go:334] "Generic (PLEG): container finished" podID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerID="c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e" exitCode=0
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.790629 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerDied","Data":"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e"}
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.791864 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerStarted","Data":"ef8afb38cbe161f1b81f860d56715a732c9c137776bc40df909c84b5acbd4154"}
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.793813 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9" event={"ID":"0f513c34-8707-46dd-9b55-e953666df46c","Type":"ContainerDied","Data":"34349f2681e98adca418f57aa55e4bcf6f5a91d14ddfb746d02fa6d79fb45869"}
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.793978 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-dbfn9"
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.797630 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-bkgpq" event={"ID":"e986d513-8aa0-4908-b200-d6212f56cd0f","Type":"ContainerDied","Data":"6c939d71a23fbd96f7ac8915514c1d72476d3ae7287584fbe750ce02fb1ef302"}
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.798000 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-bkgpq"
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.807528 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.874995 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.892450 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-bkgpq"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.908963 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"]
Mar 11 12:16:56 crc kubenswrapper[4816]: I0311 12:16:56.915225 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-dbfn9"]
Mar 11 12:16:57 crc kubenswrapper[4816]: W0311 12:16:57.063442 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5030028c_f574_4334_a837_2430761524b4.slice/crio-6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731 WatchSource:0}: Error finding container 6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731: Status 404 returned error can't find the container with id 6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731
Mar 11 12:16:57 crc kubenswrapper[4816]: W0311 12:16:57.072162 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26aea2df_f497_478d_b953_060189ef2569.slice/crio-bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056 WatchSource:0}: Error finding container bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056: Status 404 returned error can't find the container with id bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056
Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.701843 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"]
Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.808573 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8" event={"ID":"2de58390-335b-40cc-8461-d931d3b22e41","Type":"ContainerStarted","Data":"90ffa1dacc5321713c5d44a9d616add617a25ab1efffcadfb14af28f07cc7bbd"}
Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.810307 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerStarted","Data":"c1304c6acbe0151fcfd1f27a9fb0f616c29bb18a4876bb3def66924a603536ea"}
Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.811778 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerStarted","Data":"33dcb516fa17b7c432ef1e2b1650ba4d2e9f946dd76257f934af302a386a7dbf"}
Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.812950 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerStarted","Data":"bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056"}
Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.814157 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5030028c-f574-4334-a837-2430761524b4","Type":"ContainerStarted","Data":"6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731"}
Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.815948 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerStarted","Data":"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e"}
Mar 11 12:16:57 crc kubenswrapper[4816]: I0311 12:16:57.817137 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerStarted","Data":"856ecaff8a78617160b7f62ce0d1169e3c52ef425eb093d777cccb4f585957a7"}
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.143663 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f513c34-8707-46dd-9b55-e953666df46c" path="/var/lib/kubelet/pods/0f513c34-8707-46dd-9b55-e953666df46c/volumes"
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.144039 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e986d513-8aa0-4908-b200-d6212f56cd0f" path="/var/lib/kubelet/pods/e986d513-8aa0-4908-b200-d6212f56cd0f/volumes"
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.828567 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerStarted","Data":"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd"}
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.829106 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.833833 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerStarted","Data":"22c727583d6de2eec899c37134713c754f06d9d2f697ad226095e328238d230b"}
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.836400 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e9e4e8b-b60c-4c37-974a-8bdc1b243135","Type":"ContainerStarted","Data":"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b"}
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.836693 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.850448 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" podStartSLOduration=4.809305165 podStartE2EDuration="24.850429136s" podCreationTimestamp="2026-03-11 12:16:34 +0000 UTC" firstStartedPulling="2026-03-11 12:16:35.714738437 +0000 UTC m=+1082.306002404" lastFinishedPulling="2026-03-11 12:16:55.755862408 +0000 UTC m=+1102.347126375" observedRunningTime="2026-03-11 12:16:58.845455392 +0000 UTC m=+1105.436719359" watchObservedRunningTime="2026-03-11 12:16:58.850429136 +0000 UTC m=+1105.441693093"
Mar 11 12:16:58 crc kubenswrapper[4816]: I0311 12:16:58.871674 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.483705633 podStartE2EDuration="18.871639333s" podCreationTimestamp="2026-03-11 12:16:40 +0000 UTC" firstStartedPulling="2026-03-11 12:16:55.674101673 +0000 UTC m=+1102.265365640" lastFinishedPulling="2026-03-11 12:16:58.062035373 +0000 UTC m=+1104.653299340" observedRunningTime="2026-03-11 12:16:58.86776016 +0000 UTC m=+1105.459024127" watchObservedRunningTime="2026-03-11 12:16:58.871639333 +0000 UTC m=+1105.462903320"
Mar 11 12:16:59 crc kubenswrapper[4816]: I0311 12:16:59.846466 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerStarted","Data":"47287b2bd213321105c729d451b069f02c0e309af3b5c9c84b7b9c24acc1a5f3"}
Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.882192 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerStarted","Data":"90224f5e31cd4408489a5dec30ffa77147f611b179c23e40a3d0104504542a1b"}
Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.886015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerStarted","Data":"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03"}
Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.888990 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerStarted","Data":"e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6"}
Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.891583 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5030028c-f574-4334-a837-2430761524b4","Type":"ContainerStarted","Data":"0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637"}
Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.891741 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.895517 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerStarted","Data":"ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889"}
Mar 11 12:17:03 crc kubenswrapper[4816]: I0311 12:17:03.952833 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.521795739 podStartE2EDuration="25.952808831s" podCreationTimestamp="2026-03-11 12:16:38 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.069613513 +0000 UTC m=+1103.660877490" lastFinishedPulling="2026-03-11 12:17:03.500626615 +0000 UTC m=+1110.091890582" observedRunningTime="2026-03-11 12:17:03.950599567 +0000 UTC m=+1110.541863554" watchObservedRunningTime="2026-03-11 12:17:03.952808831 +0000 UTC m=+1110.544072798"
Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.922688 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8" event={"ID":"2de58390-335b-40cc-8461-d931d3b22e41","Type":"ContainerStarted","Data":"b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17"}
Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.924789 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-84rn8"
Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.929668 4816 generic.go:334] "Generic (PLEG): container finished" podID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerID="ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b" exitCode=0
Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.930001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerDied","Data":"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b"}
Mar 11 12:17:04 crc kubenswrapper[4816]: I0311 12:17:04.963645 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-84rn8" podStartSLOduration=15.483000472 podStartE2EDuration="21.963624075s" podCreationTimestamp="2026-03-11 12:16:43 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.072503507 +0000 UTC m=+1103.663767484" lastFinishedPulling="2026-03-11 12:17:03.55312708 +0000 UTC m=+1110.144391087" observedRunningTime="2026-03-11 12:17:04.943571443 +0000 UTC m=+1111.534835410" watchObservedRunningTime="2026-03-11 12:17:04.963624075 +0000 UTC m=+1111.554888042"
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.140429 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2"
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.213272 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"]
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.646277 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.750424 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") pod \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") "
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.750597 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") pod \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") "
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.750662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") pod \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\" (UID: \"80db2c12-e3f3-4f0e-8201-435f1a0b27c5\") "
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.757906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps" (OuterVolumeSpecName: "kube-api-access-bfcps") pod "80db2c12-e3f3-4f0e-8201-435f1a0b27c5" (UID: "80db2c12-e3f3-4f0e-8201-435f1a0b27c5"). InnerVolumeSpecName "kube-api-access-bfcps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.773973 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80db2c12-e3f3-4f0e-8201-435f1a0b27c5" (UID: "80db2c12-e3f3-4f0e-8201-435f1a0b27c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.777471 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config" (OuterVolumeSpecName: "config") pod "80db2c12-e3f3-4f0e-8201-435f1a0b27c5" (UID: "80db2c12-e3f3-4f0e-8201-435f1a0b27c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.852494 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfcps\" (UniqueName: \"kubernetes.io/projected/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-kube-api-access-bfcps\") on node \"crc\" DevicePath \"\""
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.852540 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.852550 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80db2c12-e3f3-4f0e-8201-435f1a0b27c5-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.950298 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerStarted","Data":"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005"}
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.950352 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerStarted","Data":"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c"}
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.951950 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.951993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw" event={"ID":"80db2c12-e3f3-4f0e-8201-435f1a0b27c5","Type":"ContainerDied","Data":"d6fb7e37a836f700eb5acfa41a76a70f710c794efd150ca6ee3fb1323c24aa37"}
Mar 11 12:17:05 crc kubenswrapper[4816]: I0311 12:17:05.952057 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-wc7mw"
Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:05.997403 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tnhfq" podStartSLOduration=17.469383217 podStartE2EDuration="22.997381636s" podCreationTimestamp="2026-03-11 12:16:43 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.986475308 +0000 UTC m=+1104.577739315" lastFinishedPulling="2026-03-11 12:17:03.514473767 +0000 UTC m=+1110.105737734" observedRunningTime="2026-03-11 12:17:05.972014699 +0000 UTC m=+1112.563278686" watchObservedRunningTime="2026-03-11 12:17:05.997381636 +0000 UTC m=+1112.588645603"
Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:06.029192 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"]
Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:06.036153 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-wc7mw"]
Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:06.143146 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80db2c12-e3f3-4f0e-8201-435f1a0b27c5" path="/var/lib/kubelet/pods/80db2c12-e3f3-4f0e-8201-435f1a0b27c5/volumes"
Mar 11 12:17:06 crc kubenswrapper[4816]: I0311 12:17:06.972624 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tnhfq"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.471214 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"]
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.472172 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.474574 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.503403 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"]
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.595147 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.596634 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.596775 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.596807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.598969 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.599054 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.612882 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"]
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.625950 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.632754 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.652561 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"]
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702055 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702147 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702185 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702267 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702296 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702318 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702343 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.702366 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.707177 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.707521 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.707589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.717420 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.725706 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.735652 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") pod \"ovn-controller-metrics-r8xbm\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.798608 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-r8xbm"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805040 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805111 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805154 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805213 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.805450 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"]
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.806218 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.806712 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.807377 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: E0311 12:17:07.807845 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-26c46 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" podUID="bc9d174a-14aa-42e4-bfc0-3b085e725504"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.841007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") pod \"dnsmasq-dns-6f9f59f7c5-fs99q\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.849281 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"]
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.850605 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.859741 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.870702 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"]
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.975019 4816 generic.go:334] "Generic (PLEG): container finished" podID="da177cde-6332-4562-809a-d4bee453cebf" containerID="933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03" exitCode=0
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.975106 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerDied","Data":"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03"}
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.980589 4816 generic.go:334] "Generic (PLEG): container finished" podID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerID="90224f5e31cd4408489a5dec30ffa77147f611b179c23e40a3d0104504542a1b" exitCode=0
Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.980664 4816
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.980641 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerDied","Data":"90224f5e31cd4408489a5dec30ffa77147f611b179c23e40a3d0104504542a1b"} Mar 11 12:17:07 crc kubenswrapper[4816]: I0311 12:17:07.990206 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.013703 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.013766 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.013823 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.014035 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.014238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.115660 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") pod \"bc9d174a-14aa-42e4-bfc0-3b085e725504\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.115837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") pod \"bc9d174a-14aa-42e4-bfc0-3b085e725504\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.115941 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") pod \"bc9d174a-14aa-42e4-bfc0-3b085e725504\" (UID: \"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.115993 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") pod \"bc9d174a-14aa-42e4-bfc0-3b085e725504\" (UID: 
\"bc9d174a-14aa-42e4-bfc0-3b085e725504\") " Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116348 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116503 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc9d174a-14aa-42e4-bfc0-3b085e725504" (UID: "bc9d174a-14aa-42e4-bfc0-3b085e725504"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116556 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config" (OuterVolumeSpecName: "config") pod "bc9d174a-14aa-42e4-bfc0-3b085e725504" (UID: "bc9d174a-14aa-42e4-bfc0-3b085e725504"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.116901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117043 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117176 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117410 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117505 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc9d174a-14aa-42e4-bfc0-3b085e725504" (UID: "bc9d174a-14aa-42e4-bfc0-3b085e725504"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.117822 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.118017 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.118674 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.130843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46" (OuterVolumeSpecName: "kube-api-access-26c46") pod "bc9d174a-14aa-42e4-bfc0-3b085e725504" (UID: "bc9d174a-14aa-42e4-bfc0-3b085e725504"). InnerVolumeSpecName "kube-api-access-26c46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.135527 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") pod \"dnsmasq-dns-5d944d7b75-r4jqj\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.197120 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.218641 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.218672 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.218681 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26c46\" (UniqueName: \"kubernetes.io/projected/bc9d174a-14aa-42e4-bfc0-3b085e725504-kube-api-access-26c46\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.218690 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9d174a-14aa-42e4-bfc0-3b085e725504-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.696649 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 11 12:17:08 crc kubenswrapper[4816]: I0311 12:17:08.991308 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-fs99q" Mar 11 12:17:09 crc kubenswrapper[4816]: I0311 12:17:09.038860 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"] Mar 11 12:17:09 crc kubenswrapper[4816]: I0311 12:17:09.044386 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-fs99q"] Mar 11 12:17:09 crc kubenswrapper[4816]: I0311 12:17:09.676527 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:09 crc kubenswrapper[4816]: W0311 12:17:09.685693 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3535eec4_3c32_4498_9c38_fbb7a5c77ee8.slice/crio-4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5 WatchSource:0}: Error finding container 4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5: Status 404 returned error can't find the container with id 4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5 Mar 11 12:17:09 crc kubenswrapper[4816]: I0311 12:17:09.940492 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:17:09 crc kubenswrapper[4816]: W0311 12:17:09.948589 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91cdfd54_2ee7_490e_bf3f_563406e59cda.slice/crio-f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a WatchSource:0}: Error finding container f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a: Status 404 returned error can't find the container with id f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.002350 4816 generic.go:334] "Generic (PLEG): container finished" podID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" 
containerID="a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922" exitCode=0 Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.002449 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerDied","Data":"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.002902 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerStarted","Data":"4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.005848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerStarted","Data":"4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.008585 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerStarted","Data":"08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.010982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerStarted","Data":"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.012910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8xbm" event={"ID":"91cdfd54-2ee7-490e-bf3f-563406e59cda","Type":"ContainerStarted","Data":"f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 
12:17:10.015095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerStarted","Data":"5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b"} Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.059581 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.066173 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.786248141 podStartE2EDuration="24.066136405s" podCreationTimestamp="2026-03-11 12:16:46 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.094893457 +0000 UTC m=+1103.686157424" lastFinishedPulling="2026-03-11 12:17:09.374781721 +0000 UTC m=+1115.966045688" observedRunningTime="2026-03-11 12:17:10.052435417 +0000 UTC m=+1116.643699384" watchObservedRunningTime="2026-03-11 12:17:10.066136405 +0000 UTC m=+1116.657400372" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.083045 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.839052235 podStartE2EDuration="27.083019656s" podCreationTimestamp="2026-03-11 12:16:43 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.073695521 +0000 UTC m=+1103.664959498" lastFinishedPulling="2026-03-11 12:17:09.317662952 +0000 UTC m=+1115.908926919" observedRunningTime="2026-03-11 12:17:10.078625328 +0000 UTC m=+1116.669889295" watchObservedRunningTime="2026-03-11 12:17:10.083019656 +0000 UTC m=+1116.674283623" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.113894 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.666554956 podStartE2EDuration="35.113870742s" podCreationTimestamp="2026-03-11 12:16:35 +0000 UTC" firstStartedPulling="2026-03-11 12:16:57.10876452 
+0000 UTC m=+1103.700028487" lastFinishedPulling="2026-03-11 12:17:03.556080306 +0000 UTC m=+1110.147344273" observedRunningTime="2026-03-11 12:17:10.109958398 +0000 UTC m=+1116.701222365" watchObservedRunningTime="2026-03-11 12:17:10.113870742 +0000 UTC m=+1116.705134709" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.146130 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.236173883 podStartE2EDuration="33.146105498s" podCreationTimestamp="2026-03-11 12:16:37 +0000 UTC" firstStartedPulling="2026-03-11 12:16:56.644024669 +0000 UTC m=+1103.235288636" lastFinishedPulling="2026-03-11 12:17:03.553956284 +0000 UTC m=+1110.145220251" observedRunningTime="2026-03-11 12:17:10.142679159 +0000 UTC m=+1116.733943126" watchObservedRunningTime="2026-03-11 12:17:10.146105498 +0000 UTC m=+1116.737369465" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.153294 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9d174a-14aa-42e4-bfc0-3b085e725504" path="/var/lib/kubelet/pods/bc9d174a-14aa-42e4-bfc0-3b085e725504/volumes" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.717242 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.749333 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.750012 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.751432 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.774832 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.793546 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.793615 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwfrm\" (UniqueName: \"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.793671 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.795200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.795271 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896878 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896942 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896969 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwfrm\" (UniqueName: \"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.896996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.900043 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.900970 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.901094 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.901102 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:10 crc kubenswrapper[4816]: I0311 12:17:10.920778 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwfrm\" (UniqueName: 
\"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") pod \"dnsmasq-dns-7b9fd7d84c-wqn2t\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.029418 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8xbm" event={"ID":"91cdfd54-2ee7-490e-bf3f-563406e59cda","Type":"ContainerStarted","Data":"be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8"} Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.034357 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerStarted","Data":"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca"} Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.034583 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.086927 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-r8xbm" podStartSLOduration=4.086906268 podStartE2EDuration="4.086906268s" podCreationTimestamp="2026-03-11 12:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:11.063074346 +0000 UTC m=+1117.654338313" watchObservedRunningTime="2026-03-11 12:17:11.086906268 +0000 UTC m=+1117.678170255" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.089601 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" podStartSLOduration=4.089585536 podStartE2EDuration="4.089585536s" podCreationTimestamp="2026-03-11 12:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-11 12:17:11.086158926 +0000 UTC m=+1117.677422893" watchObservedRunningTime="2026-03-11 12:17:11.089585536 +0000 UTC m=+1117.680849503" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.098938 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:11 crc kubenswrapper[4816]: W0311 12:17:11.585150 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcc1a78b_c3d2_4c15_81a0_0431da953e51.slice/crio-ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b WatchSource:0}: Error finding container ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b: Status 404 returned error can't find the container with id ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.589058 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.785447 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.819152 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.821880 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.829922 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.830426 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.831442 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-w2jgz" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.833416 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.848989 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.880499 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.915869 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.915934 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.916014 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.916074 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.916093 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:11 crc kubenswrapper[4816]: I0311 12:17:11.916123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.017938 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018373 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " 
pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018565 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.018723 4816 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.018758 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018775 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.018901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.019000 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.019371 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.019606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.019729 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:12.519698906 +0000 UTC m=+1119.110962873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.028165 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.037191 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.044387 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.044627 4816 generic.go:334] "Generic (PLEG): container finished" podID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerID="035e208fb3e5fc9b968f1db57d46e9bd63d57178d448cbc27d1282a58427f605" exitCode=0 Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.044727 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerDied","Data":"035e208fb3e5fc9b968f1db57d46e9bd63d57178d448cbc27d1282a58427f605"} Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.044796 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerStarted","Data":"ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b"} Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.045360 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="dnsmasq-dns" containerID="cri-o://38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" gracePeriod=10 Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.045919 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.059429 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.110699 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.114590 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.256899 4816 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.94:43420->38.102.83.94:46473: write tcp 38.102.83.94:43420->38.102.83.94:46473: write: broken pipe Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.371374 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9nggr"] Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.372818 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.376849 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.377196 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.377389 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.398923 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9nggr"] Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527499 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527569 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527613 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527667 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527690 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527738 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527761 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvz42\" (UniqueName: \"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.527785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.528000 4816 projected.go:288] Couldn't 
get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.528017 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: E0311 12:17:12.528058 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:13.528043514 +0000 UTC m=+1120.119307481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.617735 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.630854 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvz42\" (UniqueName: \"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.630911 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.630981 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631052 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631097 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 
12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631142 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631165 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.631943 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.632430 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.632949 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.641567 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.646119 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.657535 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.665367 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvz42\" (UniqueName: \"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") pod \"swift-ring-rebalance-9nggr\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.709783 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732363 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732501 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732688 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.732730 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") pod \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\" (UID: \"3535eec4-3c32-4498-9c38-fbb7a5c77ee8\") " Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.741566 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx" (OuterVolumeSpecName: "kube-api-access-9npfx") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "kube-api-access-9npfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.826526 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.828557 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.829596 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.835201 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.835236 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.835279 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.835297 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9npfx\" (UniqueName: \"kubernetes.io/projected/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-kube-api-access-9npfx\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.850081 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config" (OuterVolumeSpecName: "config") pod "3535eec4-3c32-4498-9c38-fbb7a5c77ee8" (UID: "3535eec4-3c32-4498-9c38-fbb7a5c77ee8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:12 crc kubenswrapper[4816]: I0311 12:17:12.937434 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3535eec4-3c32-4498-9c38-fbb7a5c77ee8-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.059125 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerStarted","Data":"f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7"} Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.060516 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.064294 4816 generic.go:334] "Generic (PLEG): container finished" podID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerID="38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" exitCode=0 Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.064349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerDied","Data":"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca"} Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.064416 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.073704 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r4jqj" event={"ID":"3535eec4-3c32-4498-9c38-fbb7a5c77ee8","Type":"ContainerDied","Data":"4c134446889cdcca3988e8e7afbf6fdb5eae635e176b54e4c5aea1b608efdbe5"} Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.073873 4816 scope.go:117] "RemoveContainer" containerID="38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.080182 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" podStartSLOduration=3.080158443 podStartE2EDuration="3.080158443s" podCreationTimestamp="2026-03-11 12:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:13.07798999 +0000 UTC m=+1119.669253957" watchObservedRunningTime="2026-03-11 12:17:13.080158443 +0000 UTC m=+1119.671422410" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.105132 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.110891 4816 scope.go:117] "RemoveContainer" containerID="a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.111308 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r4jqj"] Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.126330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.131745 4816 scope.go:117] "RemoveContainer" containerID="38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" Mar 11 
12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.132346 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca\": container with ID starting with 38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca not found: ID does not exist" containerID="38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.132380 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca"} err="failed to get container status \"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca\": rpc error: code = NotFound desc = could not find container \"38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca\": container with ID starting with 38f59cff7cee44ddf93b07b0aa796cce7779bf8ab96c18eea85a54c7390532ca not found: ID does not exist" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.132404 4816 scope.go:117] "RemoveContainer" containerID="a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922" Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.132849 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922\": container with ID starting with a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922 not found: ID does not exist" containerID="a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.132892 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922"} err="failed to get container status 
\"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922\": rpc error: code = NotFound desc = could not find container \"a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922\": container with ID starting with a566ebb729cdf6c830abc577e85638221c81263efde591e4eeabf9e436cd0922 not found: ID does not exist" Mar 11 12:17:13 crc kubenswrapper[4816]: W0311 12:17:13.323399 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fca72cd_9caa_4029_8c20_1623a315702d.slice/crio-33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5 WatchSource:0}: Error finding container 33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5: Status 404 returned error can't find the container with id 33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5 Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.323610 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9nggr"] Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.337885 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.338421 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="init" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.338446 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="init" Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.338480 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="dnsmasq-dns" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.338487 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="dnsmasq-dns" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 
12:17:13.338700 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" containerName="dnsmasq-dns" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.339704 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.345815 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.346147 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.346547 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.346771 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.347497 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-chwg7" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448218 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448373 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448404 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448429 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448469 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448494 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.448556 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.550851 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.550935 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.550963 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.551001 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.551029 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.551090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 
12:17:13.551182 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.551212 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.551556 4816 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.551633 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:13 crc kubenswrapper[4816]: E0311 12:17:13.551803 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:15.551746543 +0000 UTC m=+1122.143010650 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.552013 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.552075 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.552628 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.560099 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.560352 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc 
kubenswrapper[4816]: I0311 12:17:13.560531 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.575083 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") pod \"ovn-northd-0\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " pod="openstack/ovn-northd-0" Mar 11 12:17:13 crc kubenswrapper[4816]: I0311 12:17:13.667571 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 12:17:14 crc kubenswrapper[4816]: I0311 12:17:14.088705 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9nggr" event={"ID":"7fca72cd-9caa-4029-8c20-1623a315702d","Type":"ContainerStarted","Data":"33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5"} Mar 11 12:17:14 crc kubenswrapper[4816]: I0311 12:17:14.143709 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3535eec4-3c32-4498-9c38-fbb7a5c77ee8" path="/var/lib/kubelet/pods/3535eec4-3c32-4498-9c38-fbb7a5c77ee8/volumes" Mar 11 12:17:14 crc kubenswrapper[4816]: I0311 12:17:14.217519 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:17:15 crc kubenswrapper[4816]: I0311 12:17:15.099119 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerStarted","Data":"25d4f9ece0205331680bd83d3d312fa201b0497bc9a8a61346652664c99b99e2"} Mar 11 12:17:15 crc kubenswrapper[4816]: I0311 12:17:15.593379 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:15 crc kubenswrapper[4816]: E0311 12:17:15.593572 4816 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:15 crc kubenswrapper[4816]: E0311 12:17:15.593598 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:15 crc kubenswrapper[4816]: E0311 12:17:15.593652 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:19.59363499 +0000 UTC m=+1126.184898957 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:16 crc kubenswrapper[4816]: I0311 12:17:16.916566 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 11 12:17:16 crc kubenswrapper[4816]: I0311 12:17:16.916903 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 11 12:17:17 crc kubenswrapper[4816]: I0311 12:17:17.007974 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 11 12:17:17 crc kubenswrapper[4816]: I0311 12:17:17.221786 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.129386 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerStarted","Data":"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7"} Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.129961 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerStarted","Data":"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7"} Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.144042 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9nggr" event={"ID":"7fca72cd-9caa-4029-8c20-1623a315702d","Type":"ContainerStarted","Data":"c460fb14090c9d550203cf386e04b06e3563514702df072e42ad5fc80f7e1872"} Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.167193 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-ring-rebalance-9nggr" podStartSLOduration=2.014346273 podStartE2EDuration="6.167153622s" podCreationTimestamp="2026-03-11 12:17:12 +0000 UTC" firstStartedPulling="2026-03-11 12:17:13.329079115 +0000 UTC m=+1119.920343072" lastFinishedPulling="2026-03-11 12:17:17.481886454 +0000 UTC m=+1124.073150421" observedRunningTime="2026-03-11 12:17:18.162065554 +0000 UTC m=+1124.753329531" watchObservedRunningTime="2026-03-11 12:17:18.167153622 +0000 UTC m=+1124.758417589" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.458533 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.458759 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.551177 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.941218 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"] Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.945510 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.949468 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 11 12:17:18 crc kubenswrapper[4816]: I0311 12:17:18.971038 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:18.999973 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-l5lds"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.001732 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.013705 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-l5lds"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.069610 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.069683 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.069769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.069925 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqxt\" (UniqueName: \"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.167052 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.904639905 podStartE2EDuration="6.167033739s" podCreationTimestamp="2026-03-11 12:17:13 +0000 UTC" firstStartedPulling="2026-03-11 12:17:14.221515569 +0000 UTC m=+1120.812779536" lastFinishedPulling="2026-03-11 12:17:17.483909403 +0000 UTC m=+1124.075173370" observedRunningTime="2026-03-11 12:17:19.160210081 +0000 UTC m=+1125.751474048" watchObservedRunningTime="2026-03-11 12:17:19.167033739 +0000 UTC m=+1125.758297706" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.171951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.172433 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.172533 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.172576 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqxt\" (UniqueName: 
\"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.172820 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.176454 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.201160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqxt\" (UniqueName: \"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") pod \"glance-db-create-l5lds\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.201693 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") pod \"glance-c8d3-account-create-update-85zqd\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.234337 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 11 12:17:19 crc 
kubenswrapper[4816]: I0311 12:17:19.276607 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.324963 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l5lds" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.503763 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.506299 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.523149 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.589038 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.589125 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.613357 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.614539 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.620398 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.631471 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691018 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691085 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691153 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691420 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " 
pod="openstack/swift-storage-0" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.691468 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: E0311 12:17:19.691870 4816 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 12:17:19 crc kubenswrapper[4816]: E0311 12:17:19.691900 4816 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 12:17:19 crc kubenswrapper[4816]: E0311 12:17:19.691970 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift podName:485f9fbd-e0ca-472d-b97c-87c127253a96 nodeName:}" failed. No retries permitted until 2026-03-11 12:17:27.691947188 +0000 UTC m=+1134.283211155 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift") pod "swift-storage-0" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96") : configmap "swift-ring-files" not found Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.692504 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.718941 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") pod \"keystone-db-create-rmcqp\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.727750 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.729896 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.749680 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.793618 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.793682 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.793737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.793859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.794686 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.798675 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.799845 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.801961 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.813886 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.816668 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") pod \"keystone-9b21-account-create-update-r8vgg\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.851539 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.858211 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"] Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.895999 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.896094 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.896146 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.896191 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.897360 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.916376 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") pod \"placement-db-create-85nd9\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " pod="openstack/placement-db-create-85nd9" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.945193 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.998585 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:19 crc kubenswrapper[4816]: I0311 12:17:19.998665 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.000197 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.022922 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-l5lds"] Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.049952 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") pod \"placement-3c3c-account-create-update-2whdq\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.089764 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-85nd9" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.119394 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.162897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l5lds" event={"ID":"9fd32333-bdaa-461b-ac10-324291d1e5d3","Type":"ContainerStarted","Data":"3ceb41c8ea2fba7175551e2e2e287690c5e419936c11d2017fa5a383da9d61fd"} Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.170805 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c8d3-account-create-update-85zqd" event={"ID":"288dd774-6e04-45d2-b786-c7f2be7fbeae","Type":"ContainerStarted","Data":"5140353c5c6034db1623dc2f3c189d72ec962703a0a91d22d2e279ead073afac"} Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.170894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c8d3-account-create-update-85zqd" event={"ID":"288dd774-6e04-45d2-b786-c7f2be7fbeae","Type":"ContainerStarted","Data":"d08403da7f71b924e48ef9d0d5d10621dca1ae09e43b0f9bc4c8d1ae6cf47de1"} Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.196069 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-c8d3-account-create-update-85zqd" podStartSLOduration=2.196010802 podStartE2EDuration="2.196010802s" podCreationTimestamp="2026-03-11 12:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:20.190592654 +0000 UTC m=+1126.781856631" watchObservedRunningTime="2026-03-11 12:17:20.196010802 +0000 UTC m=+1126.787274769" Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.366320 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.584802 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:17:20 crc 
kubenswrapper[4816]: W0311 12:17:20.607344 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod625b367b_084e_4cf8_8c30_5d4df9c696f9.slice/crio-b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080 WatchSource:0}: Error finding container b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080: Status 404 returned error can't find the container with id b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080 Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.686294 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:17:20 crc kubenswrapper[4816]: W0311 12:17:20.742633 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d3e7fa1_3f66_495b_be44_cf97eec043c1.slice/crio-325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77 WatchSource:0}: Error finding container 325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77: Status 404 returned error can't find the container with id 325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77 Mar 11 12:17:20 crc kubenswrapper[4816]: I0311 12:17:20.842441 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:17:20 crc kubenswrapper[4816]: W0311 12:17:20.846279 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod632c5d32_5370_401a_8202_58e0ec70f357.slice/crio-b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7 WatchSource:0}: Error finding container b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7: Status 404 returned error can't find the container with id b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7 Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 
12:17:21.101507 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.186534 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rmcqp" event={"ID":"742cfc03-0365-4df8-a7f6-e6eac11ba045","Type":"ContainerStarted","Data":"d6e7d3be2f695e55ef6abf84c83d060683eae93e0020c12fe8744829cbcc1d6a"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.187085 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rmcqp" event={"ID":"742cfc03-0365-4df8-a7f6-e6eac11ba045","Type":"ContainerStarted","Data":"72f8ee81a2c1316c277ebe03c1377b981929e7b088dea5cad65f5821f4e7a02b"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.189482 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.189730 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="dnsmasq-dns" containerID="cri-o://f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" gracePeriod=10 Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.197702 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c3c-account-create-update-2whdq" event={"ID":"632c5d32-5370-401a-8202-58e0ec70f357","Type":"ContainerStarted","Data":"8c240088bce92d648a44cfc826778c591f6601fbc70cdbc9325a1348704e1a92"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.197781 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c3c-account-create-update-2whdq" event={"ID":"632c5d32-5370-401a-8202-58e0ec70f357","Type":"ContainerStarted","Data":"b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 
12:17:21.216799 4816 generic.go:334] "Generic (PLEG): container finished" podID="9fd32333-bdaa-461b-ac10-324291d1e5d3" containerID="a8f8ba02ac608528a8da635158a48ff55377bd4734bbd746e513b637d5d907d3" exitCode=0 Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.216918 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l5lds" event={"ID":"9fd32333-bdaa-461b-ac10-324291d1e5d3","Type":"ContainerDied","Data":"a8f8ba02ac608528a8da635158a48ff55377bd4734bbd746e513b637d5d907d3"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.229621 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-85nd9" event={"ID":"8d3e7fa1-3f66-495b-be44-cf97eec043c1","Type":"ContainerStarted","Data":"ab6525891e160f8b83901124157238a30564c85220f9440c25fb3222634839c7"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.229719 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-85nd9" event={"ID":"8d3e7fa1-3f66-495b-be44-cf97eec043c1","Type":"ContainerStarted","Data":"325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.241665 4816 generic.go:334] "Generic (PLEG): container finished" podID="288dd774-6e04-45d2-b786-c7f2be7fbeae" containerID="5140353c5c6034db1623dc2f3c189d72ec962703a0a91d22d2e279ead073afac" exitCode=0 Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.241746 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c8d3-account-create-update-85zqd" event={"ID":"288dd774-6e04-45d2-b786-c7f2be7fbeae","Type":"ContainerDied","Data":"5140353c5c6034db1623dc2f3c189d72ec962703a0a91d22d2e279ead073afac"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.246925 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-rmcqp" podStartSLOduration=2.246893749 podStartE2EDuration="2.246893749s" podCreationTimestamp="2026-03-11 
12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:21.232723508 +0000 UTC m=+1127.823987475" watchObservedRunningTime="2026-03-11 12:17:21.246893749 +0000 UTC m=+1127.838157716" Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.263104 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b21-account-create-update-r8vgg" event={"ID":"625b367b-084e-4cf8-8c30-5d4df9c696f9","Type":"ContainerStarted","Data":"d63f60636f6e53982a24004e405c34ed67500a9193f04b98e8d29856c8e89ee2"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.263165 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b21-account-create-update-r8vgg" event={"ID":"625b367b-084e-4cf8-8c30-5d4df9c696f9","Type":"ContainerStarted","Data":"b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080"} Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.271396 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-85nd9" podStartSLOduration=2.271383421 podStartE2EDuration="2.271383421s" podCreationTimestamp="2026-03-11 12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:21.268677272 +0000 UTC m=+1127.859941239" watchObservedRunningTime="2026-03-11 12:17:21.271383421 +0000 UTC m=+1127.862647388" Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.342959 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3c3c-account-create-update-2whdq" podStartSLOduration=2.34293672 podStartE2EDuration="2.34293672s" podCreationTimestamp="2026-03-11 12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:21.334476084 +0000 UTC 
m=+1127.925740041" watchObservedRunningTime="2026-03-11 12:17:21.34293672 +0000 UTC m=+1127.934200687" Mar 11 12:17:21 crc kubenswrapper[4816]: I0311 12:17:21.374236 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9b21-account-create-update-r8vgg" podStartSLOduration=2.374210588 podStartE2EDuration="2.374210588s" podCreationTimestamp="2026-03-11 12:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:21.369636575 +0000 UTC m=+1127.960900552" watchObservedRunningTime="2026-03-11 12:17:21.374210588 +0000 UTC m=+1127.965474555" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.197200 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.256815 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") pod \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.257007 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") pod \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.257035 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") pod \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\" (UID: \"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.267367 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2" (OuterVolumeSpecName: "kube-api-access-hw7f2") pod "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" (UID: "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5"). InnerVolumeSpecName "kube-api-access-hw7f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.274441 4816 generic.go:334] "Generic (PLEG): container finished" podID="742cfc03-0365-4df8-a7f6-e6eac11ba045" containerID="d6e7d3be2f695e55ef6abf84c83d060683eae93e0020c12fe8744829cbcc1d6a" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.274853 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rmcqp" event={"ID":"742cfc03-0365-4df8-a7f6-e6eac11ba045","Type":"ContainerDied","Data":"d6e7d3be2f695e55ef6abf84c83d060683eae93e0020c12fe8744829cbcc1d6a"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.279379 4816 generic.go:334] "Generic (PLEG): container finished" podID="632c5d32-5370-401a-8202-58e0ec70f357" containerID="8c240088bce92d648a44cfc826778c591f6601fbc70cdbc9325a1348704e1a92" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.279429 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c3c-account-create-update-2whdq" event={"ID":"632c5d32-5370-401a-8202-58e0ec70f357","Type":"ContainerDied","Data":"8c240088bce92d648a44cfc826778c591f6601fbc70cdbc9325a1348704e1a92"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281160 4816 generic.go:334] "Generic (PLEG): container finished" podID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerID="f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281290 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281519 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerDied","Data":"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281681 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-ngbb2" event={"ID":"1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5","Type":"ContainerDied","Data":"c77801a330291e43dbd716f0eaa0018246a0046f3623f314fd24c49712949d82"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.281813 4816 scope.go:117] "RemoveContainer" containerID="f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.292118 4816 generic.go:334] "Generic (PLEG): container finished" podID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" containerID="ab6525891e160f8b83901124157238a30564c85220f9440c25fb3222634839c7" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.292210 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-85nd9" event={"ID":"8d3e7fa1-3f66-495b-be44-cf97eec043c1","Type":"ContainerDied","Data":"ab6525891e160f8b83901124157238a30564c85220f9440c25fb3222634839c7"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.311423 4816 generic.go:334] "Generic (PLEG): container finished" podID="625b367b-084e-4cf8-8c30-5d4df9c696f9" containerID="d63f60636f6e53982a24004e405c34ed67500a9193f04b98e8d29856c8e89ee2" exitCode=0 Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.311760 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b21-account-create-update-r8vgg" 
event={"ID":"625b367b-084e-4cf8-8c30-5d4df9c696f9","Type":"ContainerDied","Data":"d63f60636f6e53982a24004e405c34ed67500a9193f04b98e8d29856c8e89ee2"} Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.328060 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config" (OuterVolumeSpecName: "config") pod "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" (UID: "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.328076 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" (UID: "1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.359558 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.359595 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.359608 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw7f2\" (UniqueName: \"kubernetes.io/projected/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5-kube-api-access-hw7f2\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.383002 4816 scope.go:117] "RemoveContainer" containerID="c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e" Mar 11 12:17:22 crc kubenswrapper[4816]: 
I0311 12:17:22.404863 4816 scope.go:117] "RemoveContainer" containerID="f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" Mar 11 12:17:22 crc kubenswrapper[4816]: E0311 12:17:22.405367 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd\": container with ID starting with f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd not found: ID does not exist" containerID="f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.405401 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd"} err="failed to get container status \"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd\": rpc error: code = NotFound desc = could not find container \"f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd\": container with ID starting with f1aa8d74efb7ddd77f41c0a07405cadc0d7b5b49051c87e006b24963f08b09cd not found: ID does not exist" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.405464 4816 scope.go:117] "RemoveContainer" containerID="c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e" Mar 11 12:17:22 crc kubenswrapper[4816]: E0311 12:17:22.406489 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e\": container with ID starting with c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e not found: ID does not exist" containerID="c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.406532 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e"} err="failed to get container status \"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e\": rpc error: code = NotFound desc = could not find container \"c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e\": container with ID starting with c533534709dab6e1ab18ea2fc02a9a4aef8083ea39ecbf0e3914dc859849d48e not found: ID does not exist" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.664033 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l5lds" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.679300 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.694352 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-ngbb2"] Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.768215 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcqxt\" (UniqueName: \"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") pod \"9fd32333-bdaa-461b-ac10-324291d1e5d3\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.768623 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") pod \"9fd32333-bdaa-461b-ac10-324291d1e5d3\" (UID: \"9fd32333-bdaa-461b-ac10-324291d1e5d3\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.772077 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"9fd32333-bdaa-461b-ac10-324291d1e5d3" (UID: "9fd32333-bdaa-461b-ac10-324291d1e5d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.778794 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt" (OuterVolumeSpecName: "kube-api-access-mcqxt") pod "9fd32333-bdaa-461b-ac10-324291d1e5d3" (UID: "9fd32333-bdaa-461b-ac10-324291d1e5d3"). InnerVolumeSpecName "kube-api-access-mcqxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.820113 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.872748 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") pod \"288dd774-6e04-45d2-b786-c7f2be7fbeae\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.872935 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") pod \"288dd774-6e04-45d2-b786-c7f2be7fbeae\" (UID: \"288dd774-6e04-45d2-b786-c7f2be7fbeae\") " Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.873438 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcqxt\" (UniqueName: \"kubernetes.io/projected/9fd32333-bdaa-461b-ac10-324291d1e5d3-kube-api-access-mcqxt\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.873458 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9fd32333-bdaa-461b-ac10-324291d1e5d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.874178 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "288dd774-6e04-45d2-b786-c7f2be7fbeae" (UID: "288dd774-6e04-45d2-b786-c7f2be7fbeae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.878381 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn" (OuterVolumeSpecName: "kube-api-access-2f9qn") pod "288dd774-6e04-45d2-b786-c7f2be7fbeae" (UID: "288dd774-6e04-45d2-b786-c7f2be7fbeae"). InnerVolumeSpecName "kube-api-access-2f9qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.976129 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f9qn\" (UniqueName: \"kubernetes.io/projected/288dd774-6e04-45d2-b786-c7f2be7fbeae-kube-api-access-2f9qn\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:22 crc kubenswrapper[4816]: I0311 12:17:22.976188 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288dd774-6e04-45d2-b786-c7f2be7fbeae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.324865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l5lds" event={"ID":"9fd32333-bdaa-461b-ac10-324291d1e5d3","Type":"ContainerDied","Data":"3ceb41c8ea2fba7175551e2e2e287690c5e419936c11d2017fa5a383da9d61fd"} Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.324936 4816 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="3ceb41c8ea2fba7175551e2e2e287690c5e419936c11d2017fa5a383da9d61fd" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.325030 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l5lds" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.329828 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c8d3-account-create-update-85zqd" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.329879 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c8d3-account-create-update-85zqd" event={"ID":"288dd774-6e04-45d2-b786-c7f2be7fbeae","Type":"ContainerDied","Data":"d08403da7f71b924e48ef9d0d5d10621dca1ae09e43b0f9bc4c8d1ae6cf47de1"} Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.329963 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d08403da7f71b924e48ef9d0d5d10621dca1ae09e43b0f9bc4c8d1ae6cf47de1" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.671503 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.822002 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.884729 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.906711 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") pod \"742cfc03-0365-4df8-a7f6-e6eac11ba045\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.907047 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") pod \"742cfc03-0365-4df8-a7f6-e6eac11ba045\" (UID: \"742cfc03-0365-4df8-a7f6-e6eac11ba045\") " Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.907692 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "742cfc03-0365-4df8-a7f6-e6eac11ba045" (UID: "742cfc03-0365-4df8-a7f6-e6eac11ba045"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.908735 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/742cfc03-0365-4df8-a7f6-e6eac11ba045-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.913498 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt" (OuterVolumeSpecName: "kube-api-access-bccjt") pod "742cfc03-0365-4df8-a7f6-e6eac11ba045" (UID: "742cfc03-0365-4df8-a7f6-e6eac11ba045"). InnerVolumeSpecName "kube-api-access-bccjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.928688 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-85nd9" Mar 11 12:17:23 crc kubenswrapper[4816]: I0311 12:17:23.932696 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010002 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") pod \"632c5d32-5370-401a-8202-58e0ec70f357\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010176 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") pod \"625b367b-084e-4cf8-8c30-5d4df9c696f9\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010292 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") pod \"625b367b-084e-4cf8-8c30-5d4df9c696f9\" (UID: \"625b367b-084e-4cf8-8c30-5d4df9c696f9\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") pod \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010378 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") pod \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\" (UID: \"8d3e7fa1-3f66-495b-be44-cf97eec043c1\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010412 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") pod \"632c5d32-5370-401a-8202-58e0ec70f357\" (UID: \"632c5d32-5370-401a-8202-58e0ec70f357\") " Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.010774 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bccjt\" (UniqueName: \"kubernetes.io/projected/742cfc03-0365-4df8-a7f6-e6eac11ba045-kube-api-access-bccjt\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.011309 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "625b367b-084e-4cf8-8c30-5d4df9c696f9" (UID: "625b367b-084e-4cf8-8c30-5d4df9c696f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.011515 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "632c5d32-5370-401a-8202-58e0ec70f357" (UID: "632c5d32-5370-401a-8202-58e0ec70f357"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.011728 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d3e7fa1-3f66-495b-be44-cf97eec043c1" (UID: "8d3e7fa1-3f66-495b-be44-cf97eec043c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.014598 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc" (OuterVolumeSpecName: "kube-api-access-c2mbc") pod "625b367b-084e-4cf8-8c30-5d4df9c696f9" (UID: "625b367b-084e-4cf8-8c30-5d4df9c696f9"). InnerVolumeSpecName "kube-api-access-c2mbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.014671 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5" (OuterVolumeSpecName: "kube-api-access-gk5p5") pod "632c5d32-5370-401a-8202-58e0ec70f357" (UID: "632c5d32-5370-401a-8202-58e0ec70f357"). InnerVolumeSpecName "kube-api-access-gk5p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.015121 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b" (OuterVolumeSpecName: "kube-api-access-x4p6b") pod "8d3e7fa1-3f66-495b-be44-cf97eec043c1" (UID: "8d3e7fa1-3f66-495b-be44-cf97eec043c1"). InnerVolumeSpecName "kube-api-access-x4p6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112865 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625b367b-084e-4cf8-8c30-5d4df9c696f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112928 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4p6b\" (UniqueName: \"kubernetes.io/projected/8d3e7fa1-3f66-495b-be44-cf97eec043c1-kube-api-access-x4p6b\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112946 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d3e7fa1-3f66-495b-be44-cf97eec043c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112959 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/632c5d32-5370-401a-8202-58e0ec70f357-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112973 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk5p5\" (UniqueName: \"kubernetes.io/projected/632c5d32-5370-401a-8202-58e0ec70f357-kube-api-access-gk5p5\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.112986 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2mbc\" (UniqueName: \"kubernetes.io/projected/625b367b-084e-4cf8-8c30-5d4df9c696f9-kube-api-access-c2mbc\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.180664 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" path="/var/lib/kubelet/pods/1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5/volumes" Mar 11 12:17:24 crc 
kubenswrapper[4816]: I0311 12:17:24.272921 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-n98v5"] Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273532 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd32333-bdaa-461b-ac10-324291d1e5d3" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273562 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd32333-bdaa-461b-ac10-324291d1e5d3" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273591 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625b367b-084e-4cf8-8c30-5d4df9c696f9" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273602 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="625b367b-084e-4cf8-8c30-5d4df9c696f9" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273617 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="init" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273629 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="init" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273655 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273663 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273675 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="dnsmasq-dns" Mar 11 12:17:24 crc 
kubenswrapper[4816]: I0311 12:17:24.273685 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="dnsmasq-dns" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273702 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632c5d32-5370-401a-8202-58e0ec70f357" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273710 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="632c5d32-5370-401a-8202-58e0ec70f357" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273721 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742cfc03-0365-4df8-a7f6-e6eac11ba045" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273731 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="742cfc03-0365-4df8-a7f6-e6eac11ba045" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: E0311 12:17:24.273750 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288dd774-6e04-45d2-b786-c7f2be7fbeae" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273759 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="288dd774-6e04-45d2-b786-c7f2be7fbeae" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.273986 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274014 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="625b367b-084e-4cf8-8c30-5d4df9c696f9" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274028 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="632c5d32-5370-401a-8202-58e0ec70f357" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274039 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="288dd774-6e04-45d2-b786-c7f2be7fbeae" containerName="mariadb-account-create-update" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274051 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd32333-bdaa-461b-ac10-324291d1e5d3" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274063 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe27f2c-fcd4-42f9-8d14-9ad29dbf86b5" containerName="dnsmasq-dns" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.274074 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="742cfc03-0365-4df8-a7f6-e6eac11ba045" containerName="mariadb-database-create" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.275379 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.278768 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-22dm7" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.279893 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.295957 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n98v5"] Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.338832 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rmcqp" event={"ID":"742cfc03-0365-4df8-a7f6-e6eac11ba045","Type":"ContainerDied","Data":"72f8ee81a2c1316c277ebe03c1377b981929e7b088dea5cad65f5821f4e7a02b"} Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.338891 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f8ee81a2c1316c277ebe03c1377b981929e7b088dea5cad65f5821f4e7a02b" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.338889 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rmcqp" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.340746 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c3c-account-create-update-2whdq" event={"ID":"632c5d32-5370-401a-8202-58e0ec70f357","Type":"ContainerDied","Data":"b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7"} Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.340791 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3c3c-account-create-update-2whdq" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.340808 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82ed4a76542786e4b16c41e7a02f7bd83269f45c60c92a6fba6442f53946ac7" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.342364 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-85nd9" event={"ID":"8d3e7fa1-3f66-495b-be44-cf97eec043c1","Type":"ContainerDied","Data":"325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77"} Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.342418 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="325c8645905fb2092da099ab41a3a90f525a5fd495df91580bbe9c8dfd427b77" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.342530 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-85nd9" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.348371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b21-account-create-update-r8vgg" event={"ID":"625b367b-084e-4cf8-8c30-5d4df9c696f9","Type":"ContainerDied","Data":"b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080"} Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.348424 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1d1b3a696ed28345a8a58187f340852c23ee5c0f7a319f65f9096cd2efaa080" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.348475 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-r8vgg" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.419955 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.420103 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.420138 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.420166 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.521527 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") pod \"glance-db-sync-n98v5\" (UID: 
\"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.521678 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.521714 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.521747 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.527212 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.527557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 
12:17:24.528232 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.539478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") pod \"glance-db-sync-n98v5\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:24 crc kubenswrapper[4816]: I0311 12:17:24.596881 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n98v5" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.024290 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n98v5"] Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.358579 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n98v5" event={"ID":"b6745bae-b403-4a86-9148-8baecc00f8b1","Type":"ContainerStarted","Data":"7ade065f6f708de586323f677e56810ada0b99da337e5a079b57da2cc0b0b5a0"} Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.571570 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.572789 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.575435 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.585307 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.642860 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.643076 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.744422 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.744587 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") pod \"root-account-create-update-9mq9q\" (UID: 
\"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.745734 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.774225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") pod \"root-account-create-update-9mq9q\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:25 crc kubenswrapper[4816]: I0311 12:17:25.898476 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:26 crc kubenswrapper[4816]: I0311 12:17:26.385017 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fca72cd-9caa-4029-8c20-1623a315702d" containerID="c460fb14090c9d550203cf386e04b06e3563514702df072e42ad5fc80f7e1872" exitCode=0 Mar 11 12:17:26 crc kubenswrapper[4816]: I0311 12:17:26.385085 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9nggr" event={"ID":"7fca72cd-9caa-4029-8c20-1623a315702d","Type":"ContainerDied","Data":"c460fb14090c9d550203cf386e04b06e3563514702df072e42ad5fc80f7e1872"} Mar 11 12:17:26 crc kubenswrapper[4816]: I0311 12:17:26.436461 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:26 crc kubenswrapper[4816]: W0311 12:17:26.442078 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2da861f5_2cc3_402f_aca5_afbce135baaa.slice/crio-4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b WatchSource:0}: Error finding container 4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b: Status 404 returned error can't find the container with id 4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.397071 4816 generic.go:334] "Generic (PLEG): container finished" podID="2da861f5-2cc3-402f-aca5-afbce135baaa" containerID="0de73c3da519dc3d23fdd410a58406f0ff5aec8f4b5e6483b5c4a546f3b60ef0" exitCode=0 Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.397626 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9mq9q" event={"ID":"2da861f5-2cc3-402f-aca5-afbce135baaa","Type":"ContainerDied","Data":"0de73c3da519dc3d23fdd410a58406f0ff5aec8f4b5e6483b5c4a546f3b60ef0"} Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.397667 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9mq9q" event={"ID":"2da861f5-2cc3-402f-aca5-afbce135baaa","Type":"ContainerStarted","Data":"4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b"} Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.703424 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.712271 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"swift-storage-0\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " pod="openstack/swift-storage-0" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.766112 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.820958 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.906591 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.906670 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.907004 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.907081 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.907129 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.908365 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvz42\" (UniqueName: 
\"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.908412 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") pod \"7fca72cd-9caa-4029-8c20-1623a315702d\" (UID: \"7fca72cd-9caa-4029-8c20-1623a315702d\") " Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.908914 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.910490 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.919045 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42" (OuterVolumeSpecName: "kube-api-access-kvz42") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "kube-api-access-kvz42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.922087 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.942795 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.944004 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts" (OuterVolumeSpecName: "scripts") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:27 crc kubenswrapper[4816]: I0311 12:17:27.945544 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fca72cd-9caa-4029-8c20-1623a315702d" (UID: "7fca72cd-9caa-4029-8c20-1623a315702d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010592 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvz42\" (UniqueName: \"kubernetes.io/projected/7fca72cd-9caa-4029-8c20-1623a315702d-kube-api-access-kvz42\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010632 4816 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010642 4816 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7fca72cd-9caa-4029-8c20-1623a315702d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010651 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fca72cd-9caa-4029-8c20-1623a315702d-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010661 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010669 4816 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.010677 4816 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7fca72cd-9caa-4029-8c20-1623a315702d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.368238 4816 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:17:28 crc kubenswrapper[4816]: W0311 12:17:28.379461 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod485f9fbd_e0ca_472d_b97c_87c127253a96.slice/crio-bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770 WatchSource:0}: Error finding container bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770: Status 404 returned error can't find the container with id bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770 Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.411521 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9nggr" event={"ID":"7fca72cd-9caa-4029-8c20-1623a315702d","Type":"ContainerDied","Data":"33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5"} Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.411563 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b17ac615d74325a9263091c6d521ccba5681421913cd3808e6a592677fe4c5" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.411574 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9nggr" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.413894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770"} Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.747577 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.830462 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") pod \"2da861f5-2cc3-402f-aca5-afbce135baaa\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.830771 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") pod \"2da861f5-2cc3-402f-aca5-afbce135baaa\" (UID: \"2da861f5-2cc3-402f-aca5-afbce135baaa\") " Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.831302 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2da861f5-2cc3-402f-aca5-afbce135baaa" (UID: "2da861f5-2cc3-402f-aca5-afbce135baaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.836565 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8" (OuterVolumeSpecName: "kube-api-access-qnkj8") pod "2da861f5-2cc3-402f-aca5-afbce135baaa" (UID: "2da861f5-2cc3-402f-aca5-afbce135baaa"). InnerVolumeSpecName "kube-api-access-qnkj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.935341 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnkj8\" (UniqueName: \"kubernetes.io/projected/2da861f5-2cc3-402f-aca5-afbce135baaa-kube-api-access-qnkj8\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:28 crc kubenswrapper[4816]: I0311 12:17:28.935384 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2da861f5-2cc3-402f-aca5-afbce135baaa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:29 crc kubenswrapper[4816]: I0311 12:17:29.426774 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9mq9q" event={"ID":"2da861f5-2cc3-402f-aca5-afbce135baaa","Type":"ContainerDied","Data":"4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b"} Mar 11 12:17:29 crc kubenswrapper[4816]: I0311 12:17:29.426828 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4becd7c4791fdd33ec56c96adc3b97cb604d2b1821c1e588fbfa3b9a0ae5597b" Mar 11 12:17:29 crc kubenswrapper[4816]: I0311 12:17:29.426858 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9mq9q" Mar 11 12:17:30 crc kubenswrapper[4816]: I0311 12:17:30.441855 4816 generic.go:334] "Generic (PLEG): container finished" podID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerID="522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e" exitCode=0 Mar 11 12:17:30 crc kubenswrapper[4816]: I0311 12:17:30.441931 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerDied","Data":"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e"} Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.039599 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.048110 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9mq9q"] Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.140029 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da861f5-2cc3-402f-aca5-afbce135baaa" path="/var/lib/kubelet/pods/2da861f5-2cc3-402f-aca5-afbce135baaa/volumes" Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.458077 4816 generic.go:334] "Generic (PLEG): container finished" podID="26aea2df-f497-478d-b953-060189ef2569" containerID="47287b2bd213321105c729d451b069f02c0e309af3b5c9c84b7b9c24acc1a5f3" exitCode=0 Mar 11 12:17:32 crc kubenswrapper[4816]: I0311 12:17:32.458132 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerDied","Data":"47287b2bd213321105c729d451b069f02c0e309af3b5c9c84b7b9c24acc1a5f3"} Mar 11 12:17:33 crc kubenswrapper[4816]: I0311 12:17:33.748343 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 11 12:17:34 crc kubenswrapper[4816]: I0311 
12:17:34.562644 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84rn8" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" probeResult="failure" output=< Mar 11 12:17:34 crc kubenswrapper[4816]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 11 12:17:34 crc kubenswrapper[4816]: > Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.058414 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:17:37 crc kubenswrapper[4816]: E0311 12:17:37.059031 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da861f5-2cc3-402f-aca5-afbce135baaa" containerName="mariadb-account-create-update" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.059047 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da861f5-2cc3-402f-aca5-afbce135baaa" containerName="mariadb-account-create-update" Mar 11 12:17:37 crc kubenswrapper[4816]: E0311 12:17:37.059077 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca72cd-9caa-4029-8c20-1623a315702d" containerName="swift-ring-rebalance" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.059086 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca72cd-9caa-4029-8c20-1623a315702d" containerName="swift-ring-rebalance" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.059305 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fca72cd-9caa-4029-8c20-1623a315702d" containerName="swift-ring-rebalance" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.059323 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da861f5-2cc3-402f-aca5-afbce135baaa" containerName="mariadb-account-create-update" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.060045 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.065138 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.070931 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.120906 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.121046 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.224873 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.224926 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") pod \"root-account-create-update-7l6hp\" (UID: 
\"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.225944 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.259985 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") pod \"root-account-create-update-7l6hp\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:37 crc kubenswrapper[4816]: I0311 12:17:37.385227 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.368560 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.370517 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.556158 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84rn8" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" probeResult="failure" output=< Mar 11 12:17:39 crc kubenswrapper[4816]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 11 12:17:39 crc kubenswrapper[4816]: > Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.620103 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.621303 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.623055 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.648983 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670212 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670270 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670343 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") 
" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670369 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.670460 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772029 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772100 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772130 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: 
\"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772225 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772279 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772666 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772728 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772737 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " 
pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.772941 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.773778 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.774212 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.794328 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") pod \"ovn-controller-84rn8-config-gh5kb\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:39 crc kubenswrapper[4816]: I0311 12:17:39.968283 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:43 crc kubenswrapper[4816]: E0311 12:17:43.241523 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120" Mar 11 12:17:43 crc kubenswrapper[4816]: E0311 12:17:43.242118 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pghjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImageP
ullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-n98v5_openstack(b6745bae-b403-4a86-9148-8baecc00f8b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:17:43 crc kubenswrapper[4816]: E0311 12:17:43.243778 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-n98v5" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.554791 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerStarted","Data":"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f"} Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.555584 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:17:43 crc kubenswrapper[4816]: E0311 12:17:43.556791 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120\\\"\"" 
pod="openstack/glance-db-sync-n98v5" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.592419 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.005836596 podStartE2EDuration="1m9.592405472s" podCreationTimestamp="2026-03-11 12:16:34 +0000 UTC" firstStartedPulling="2026-03-11 12:16:36.16715988 +0000 UTC m=+1082.758423847" lastFinishedPulling="2026-03-11 12:16:55.753728756 +0000 UTC m=+1102.344992723" observedRunningTime="2026-03-11 12:17:43.590555118 +0000 UTC m=+1150.181819085" watchObservedRunningTime="2026-03-11 12:17:43.592405472 +0000 UTC m=+1150.183669429" Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.819115 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:43 crc kubenswrapper[4816]: W0311 12:17:43.825937 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb02a85f_cc39_4119_a962_4b4fd66c015d.slice/crio-442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35 WatchSource:0}: Error finding container 442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35: Status 404 returned error can't find the container with id 442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35 Mar 11 12:17:43 crc kubenswrapper[4816]: I0311 12:17:43.963777 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.552781 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-84rn8" Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.564715 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8-config-gh5kb" 
event={"ID":"eb02a85f-cc39-4119-a962-4b4fd66c015d","Type":"ContainerStarted","Data":"c9628d19e1e9c78361e9677b8afa40ad86295ad47aa0110e4b51ead3233c90ca"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.564760 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8-config-gh5kb" event={"ID":"eb02a85f-cc39-4119-a962-4b4fd66c015d","Type":"ContainerStarted","Data":"442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.572474 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.572550 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.572566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.572577 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.574163 4816 generic.go:334] "Generic (PLEG): container finished" podID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" containerID="3dfe4dd28e66c33830345db1226180f842f3ae3d59f4fa3a4c553af39dd07c67" exitCode=0 Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.574235 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7l6hp" event={"ID":"ee11077d-39aa-44c4-9cf3-a8a80647bc50","Type":"ContainerDied","Data":"3dfe4dd28e66c33830345db1226180f842f3ae3d59f4fa3a4c553af39dd07c67"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.574369 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7l6hp" event={"ID":"ee11077d-39aa-44c4-9cf3-a8a80647bc50","Type":"ContainerStarted","Data":"8a5daaf8f9d67a350db46e8d3c4baf6ef0a1efc9cf1a945115cf84cf82ad6093"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.581098 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerStarted","Data":"0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7"} Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.581576 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.603542 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-84rn8-config-gh5kb" podStartSLOduration=5.603525045 podStartE2EDuration="5.603525045s" podCreationTimestamp="2026-03-11 12:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:44.597330325 +0000 UTC m=+1151.188594292" watchObservedRunningTime="2026-03-11 12:17:44.603525045 +0000 UTC m=+1151.194789012" Mar 11 12:17:44 crc kubenswrapper[4816]: I0311 12:17:44.645580 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=70.645559727 podStartE2EDuration="1m10.645559727s" podCreationTimestamp="2026-03-11 12:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:44.640578092 +0000 UTC m=+1151.231842049" watchObservedRunningTime="2026-03-11 12:17:44.645559727 +0000 UTC m=+1151.236823694" Mar 11 12:17:45 crc kubenswrapper[4816]: I0311 12:17:45.596028 4816 generic.go:334] "Generic (PLEG): container finished" podID="eb02a85f-cc39-4119-a962-4b4fd66c015d" containerID="c9628d19e1e9c78361e9677b8afa40ad86295ad47aa0110e4b51ead3233c90ca" exitCode=0 Mar 11 12:17:45 crc kubenswrapper[4816]: I0311 12:17:45.596311 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8-config-gh5kb" event={"ID":"eb02a85f-cc39-4119-a962-4b4fd66c015d","Type":"ContainerDied","Data":"c9628d19e1e9c78361e9677b8afa40ad86295ad47aa0110e4b51ead3233c90ca"} Mar 11 12:17:45 crc kubenswrapper[4816]: I0311 12:17:45.953865 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.130046 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") pod \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.130657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee11077d-39aa-44c4-9cf3-a8a80647bc50" (UID: "ee11077d-39aa-44c4-9cf3-a8a80647bc50"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.130861 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") pod \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\" (UID: \"ee11077d-39aa-44c4-9cf3-a8a80647bc50\") " Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.131899 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee11077d-39aa-44c4-9cf3-a8a80647bc50-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.136110 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s" (OuterVolumeSpecName: "kube-api-access-mw62s") pod "ee11077d-39aa-44c4-9cf3-a8a80647bc50" (UID: "ee11077d-39aa-44c4-9cf3-a8a80647bc50"). InnerVolumeSpecName "kube-api-access-mw62s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.235545 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw62s\" (UniqueName: \"kubernetes.io/projected/ee11077d-39aa-44c4-9cf3-a8a80647bc50-kube-api-access-mw62s\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.646556 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.646871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.646883 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.646894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.653000 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7l6hp" event={"ID":"ee11077d-39aa-44c4-9cf3-a8a80647bc50","Type":"ContainerDied","Data":"8a5daaf8f9d67a350db46e8d3c4baf6ef0a1efc9cf1a945115cf84cf82ad6093"} Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.653077 4816 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="8a5daaf8f9d67a350db46e8d3c4baf6ef0a1efc9cf1a945115cf84cf82ad6093" Mar 11 12:17:46 crc kubenswrapper[4816]: I0311 12:17:46.653180 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7l6hp" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.018979 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.152370 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.152917 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.152980 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153010 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153016 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run" (OuterVolumeSpecName: "var-run") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153067 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153079 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153106 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") pod \"eb02a85f-cc39-4119-a962-4b4fd66c015d\" (UID: \"eb02a85f-cc39-4119-a962-4b4fd66c015d\") " Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153092 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153632 4816 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153657 4816 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153672 4816 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb02a85f-cc39-4119-a962-4b4fd66c015d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153769 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.153909 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts" (OuterVolumeSpecName: "scripts") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.159652 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst" (OuterVolumeSpecName: "kube-api-access-ppkst") pod "eb02a85f-cc39-4119-a962-4b4fd66c015d" (UID: "eb02a85f-cc39-4119-a962-4b4fd66c015d"). InnerVolumeSpecName "kube-api-access-ppkst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.254920 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppkst\" (UniqueName: \"kubernetes.io/projected/eb02a85f-cc39-4119-a962-4b4fd66c015d-kube-api-access-ppkst\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.254958 4816 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.254967 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb02a85f-cc39-4119-a962-4b4fd66c015d-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.662229 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8-config-gh5kb" event={"ID":"eb02a85f-cc39-4119-a962-4b4fd66c015d","Type":"ContainerDied","Data":"442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35"} Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.662289 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="442c4ad71e60ee1fc52e7396b8d83c71aa66e6183cf09a9aee4eec94596dee35" Mar 11 12:17:47 crc kubenswrapper[4816]: I0311 12:17:47.662333 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84rn8-config-gh5kb" Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.194094 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.204960 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-84rn8-config-gh5kb"] Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.687871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477"} Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.687911 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4"} Mar 11 12:17:48 crc kubenswrapper[4816]: I0311 12:17:48.687921 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.704323 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.704683 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.704705 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.704720 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerStarted","Data":"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd"} Mar 11 12:17:49 crc kubenswrapper[4816]: I0311 12:17:49.749102 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.290262902 podStartE2EDuration="39.749083585s" podCreationTimestamp="2026-03-11 12:17:10 +0000 UTC" firstStartedPulling="2026-03-11 12:17:28.382708007 +0000 UTC m=+1134.973971974" lastFinishedPulling="2026-03-11 12:17:47.84152869 +0000 UTC m=+1154.432792657" observedRunningTime="2026-03-11 12:17:49.74789168 +0000 UTC m=+1156.339155647" watchObservedRunningTime="2026-03-11 12:17:49.749083585 +0000 UTC m=+1156.340347552" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.055735 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:17:50 crc kubenswrapper[4816]: E0311 12:17:50.056657 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb02a85f-cc39-4119-a962-4b4fd66c015d" containerName="ovn-config" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.056738 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb02a85f-cc39-4119-a962-4b4fd66c015d" containerName="ovn-config" Mar 11 12:17:50 crc kubenswrapper[4816]: E0311 12:17:50.056816 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" containerName="mariadb-account-create-update" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.056874 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" containerName="mariadb-account-create-update" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.057129 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" containerName="mariadb-account-create-update" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.057214 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb02a85f-cc39-4119-a962-4b4fd66c015d" containerName="ovn-config" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.058622 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.067372 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.084010 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.141986 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb02a85f-cc39-4119-a962-4b4fd66c015d" path="/var/lib/kubelet/pods/eb02a85f-cc39-4119-a962-4b4fd66c015d/volumes" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208234 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208310 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208351 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208374 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208394 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.208489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.310362 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.310760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.310890 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.311100 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.311211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.311476 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.312624 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.313742 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.314543 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.315129 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.315750 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") pod 
\"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.336902 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") pod \"dnsmasq-dns-67754df655-hhz24\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:50 crc kubenswrapper[4816]: I0311 12:17:50.436983 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:51 crc kubenswrapper[4816]: I0311 12:17:51.432514 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:17:51 crc kubenswrapper[4816]: W0311 12:17:51.437161 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c46d79_aa47_428c_abec_a6f94c66e9ab.slice/crio-81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2 WatchSource:0}: Error finding container 81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2: Status 404 returned error can't find the container with id 81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2 Mar 11 12:17:51 crc kubenswrapper[4816]: I0311 12:17:51.721712 4816 generic.go:334] "Generic (PLEG): container finished" podID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerID="7cfd948e58ca0b33af11396daaf98403ef86af1a5fd0724d0ce0200e144ab4fe" exitCode=0 Mar 11 12:17:51 crc kubenswrapper[4816]: I0311 12:17:51.721763 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerDied","Data":"7cfd948e58ca0b33af11396daaf98403ef86af1a5fd0724d0ce0200e144ab4fe"} Mar 11 12:17:51 crc 
kubenswrapper[4816]: I0311 12:17:51.721889 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerStarted","Data":"81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2"} Mar 11 12:17:52 crc kubenswrapper[4816]: I0311 12:17:52.730569 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerStarted","Data":"e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2"} Mar 11 12:17:52 crc kubenswrapper[4816]: I0311 12:17:52.731084 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:17:52 crc kubenswrapper[4816]: I0311 12:17:52.754591 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67754df655-hhz24" podStartSLOduration=2.7545667160000002 podStartE2EDuration="2.754566716s" podCreationTimestamp="2026-03-11 12:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:52.751366923 +0000 UTC m=+1159.342630890" watchObservedRunningTime="2026-03-11 12:17:52.754566716 +0000 UTC m=+1159.345830703" Mar 11 12:17:55 crc kubenswrapper[4816]: I0311 12:17:55.826479 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:17:56 crc kubenswrapper[4816]: I0311 12:17:56.309597 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 11 12:17:57 crc kubenswrapper[4816]: I0311 12:17:57.964245 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xm9d9"] Mar 11 12:17:57 crc kubenswrapper[4816]: I0311 12:17:57.972232 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xm9d9"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.016194 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xm9d9"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.049796 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.049897 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.074155 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.077792 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-gkcsc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.145738 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.151949 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.152075 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.153003 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.166585 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.185010 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") pod \"cinder-db-create-xm9d9\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " pod="openstack/cinder-db-create-xm9d9"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.253573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.253640 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.281178 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cnlpc"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.282372 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnlpc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.295468 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cnlpc"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.326748 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xm9d9"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.334938 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kbmsk"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.336048 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.341170 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kbmsk"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.343091 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8k5jj"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.343306 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.343429 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.343568 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.355963 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.356057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.357190 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.383511 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") pod \"cinder-4bcf-account-create-update-gkcsc\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " pod="openstack/cinder-4bcf-account-create-update-gkcsc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.410169 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.411240 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-w2lrf"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.414780 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.429001 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.460826 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.460928 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461004 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461043 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461077 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461223 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.461416 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxfx\" (UniqueName: \"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.476979 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-gkcsc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.501926 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4kpfn"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.503322 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4kpfn"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.548765 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4kpfn"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563808 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563846 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563871 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563895 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563930 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.563964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxfx\" (UniqueName: \"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.565376 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.566044 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.576416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.586146 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.594628 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.596035 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-gxhxz"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.596750 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") pod \"keystone-db-sync-kbmsk\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.597270 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") pod \"neutron-963f-account-create-update-w2lrf\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " pod="openstack/neutron-963f-account-create-update-w2lrf"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.601212 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.601514 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxfx\" (UniqueName: \"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") pod \"barbican-db-create-cnlpc\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " pod="openstack/barbican-db-create-cnlpc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.618410 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnlpc"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.627636 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"]
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.666500 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.666545 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.737913 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kbmsk"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.748994 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-w2lrf"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.770760 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.771135 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.771163 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.771201 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.771980 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.788329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") pod \"neutron-db-create-4kpfn\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " pod="openstack/neutron-db-create-4kpfn"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.874407 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.874529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.875224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.909923 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") pod \"barbican-4a8d-account-create-update-gxhxz\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " pod="openstack/barbican-4a8d-account-create-update-gxhxz"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.925793 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4kpfn"
Mar 11 12:17:58 crc kubenswrapper[4816]: I0311 12:17:58.947166 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-gxhxz"
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.037707 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xm9d9"]
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.147901 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"]
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.259091 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cnlpc"]
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.329266 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"]
Mar 11 12:17:59 crc kubenswrapper[4816]: W0311 12:17:59.478689 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e6e9e0_bfd4_4e8d_823b_9e2bfdfe6d56.slice/crio-0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e WatchSource:0}: Error finding container 0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e: Status 404 returned error can't find the container with id 0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.502612 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kbmsk"]
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.629214 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4kpfn"]
Mar 11 12:17:59 crc kubenswrapper[4816]: W0311 12:17:59.630044 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66951176_170f_4d49_9a92_aeeb66f4a79c.slice/crio-f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1 WatchSource:0}: Error finding container f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1: Status 404 returned error can't find the container with id f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.646179 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"]
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.796053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-gkcsc" event={"ID":"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16","Type":"ContainerStarted","Data":"369058cbb8fa5b6fcca641d0c6bacd8fb984decb4576950458c2fae4a2d14692"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.796102 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-gkcsc" event={"ID":"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16","Type":"ContainerStarted","Data":"e8ca11c82565ae3f35fb6628a625997e61984edf2584bf5b6f01f77d24b2ea45"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.801088 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-gxhxz" event={"ID":"27a1317c-41a6-4589-949b-e422d7fe8837","Type":"ContainerStarted","Data":"04ba277b83a2833a8a1e2667f16776d5e0146b956d56c20da2b18a82224445d9"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.819944 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4kpfn" event={"ID":"66951176-170f-4d49-9a92-aeeb66f4a79c","Type":"ContainerStarted","Data":"f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.824806 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-4bcf-account-create-update-gkcsc" podStartSLOduration=1.824786458 podStartE2EDuration="1.824786458s" podCreationTimestamp="2026-03-11 12:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:59.817676592 +0000 UTC m=+1166.408940559" watchObservedRunningTime="2026-03-11 12:17:59.824786458 +0000 UTC m=+1166.416050415"
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.826735 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm9d9" event={"ID":"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac","Type":"ContainerStarted","Data":"5d5e0febf80ed4282e61f8380eff77836a90544898e5ff129bf2d82dd15449ea"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.826762 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm9d9" event={"ID":"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac","Type":"ContainerStarted","Data":"29a2d35d034352fc106b2d5ac30a149571b3d9b03ee18c6197d6c9b89fe24636"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.831102 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnlpc" event={"ID":"a25abdc0-8516-4747-a589-78db9bc64ca3","Type":"ContainerStarted","Data":"19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.831130 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnlpc" event={"ID":"a25abdc0-8516-4747-a589-78db9bc64ca3","Type":"ContainerStarted","Data":"dbe8001a30913aacdd0f84f1edb48ad5e6ee0a7f33e24c58a948426697a06086"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.834599 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-w2lrf" event={"ID":"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6","Type":"ContainerStarted","Data":"16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.834628 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-w2lrf" event={"ID":"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6","Type":"ContainerStarted","Data":"b6cfc328bc6a5ec19bbd297d5021170486b6ed13ec54ef1b30a52f9abba8f3f9"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.846764 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kbmsk" event={"ID":"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56","Type":"ContainerStarted","Data":"0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.855367 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n98v5" event={"ID":"b6745bae-b403-4a86-9148-8baecc00f8b1","Type":"ContainerStarted","Data":"e5f24ad51eefb627e15014c3582b64a13468c820d9ad9ccfa53acd2f0fb30054"}
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.877242 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-963f-account-create-update-w2lrf" podStartSLOduration=1.877208901 podStartE2EDuration="1.877208901s" podCreationTimestamp="2026-03-11 12:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:59.86993715 +0000 UTC m=+1166.461201117" watchObservedRunningTime="2026-03-11 12:17:59.877208901 +0000 UTC m=+1166.468472868"
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.931616 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-n98v5" podStartSLOduration=2.134032273 podStartE2EDuration="35.931590961s" podCreationTimestamp="2026-03-11 12:17:24 +0000 UTC" firstStartedPulling="2026-03-11 12:17:25.030210096 +0000 UTC m=+1131.621474063" lastFinishedPulling="2026-03-11 12:17:58.827768784 +0000 UTC m=+1165.419032751" observedRunningTime="2026-03-11 12:17:59.925739991 +0000 UTC m=+1166.517003958" watchObservedRunningTime="2026-03-11 12:17:59.931590961 +0000 UTC m=+1166.522854928"
Mar 11 12:17:59 crc kubenswrapper[4816]: I0311 12:17:59.940639 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-cnlpc" podStartSLOduration=1.940619343 podStartE2EDuration="1.940619343s" podCreationTimestamp="2026-03-11 12:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:17:59.905854673 +0000 UTC m=+1166.497118640" watchObservedRunningTime="2026-03-11 12:17:59.940619343 +0000 UTC m=+1166.531883310"
Mar 11 12:18:00 crc kubenswrapper[4816]: E0311 12:18:00.074616 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda25abdc0_8516_4747_a589_78db9bc64ca3.slice/crio-conmon-19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode82cb42a_5dbf_43d1_a71c_18b3e6d252d6.slice/crio-16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76.scope\": RecentStats: unable to find data in memory cache]"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.186330 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"]
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.187338 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"]
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.187420 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553858-brk44"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.192502 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.192673 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.192569 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.316585 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") pod \"auto-csr-approver-29553858-brk44\" (UID: \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\") " pod="openshift-infra/auto-csr-approver-29553858-brk44"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.418615 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") pod \"auto-csr-approver-29553858-brk44\" (UID: \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\") " pod="openshift-infra/auto-csr-approver-29553858-brk44"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.443357 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67754df655-hhz24"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.458007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") pod \"auto-csr-approver-29553858-brk44\" (UID: \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\") " pod="openshift-infra/auto-csr-approver-29553858-brk44"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.550737 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"]
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.551585 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="dnsmasq-dns" containerID="cri-o://f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7" gracePeriod=10
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.612105 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553858-brk44"
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.884695 4816 generic.go:334] "Generic (PLEG): container finished" podID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" containerID="5d5e0febf80ed4282e61f8380eff77836a90544898e5ff129bf2d82dd15449ea" exitCode=0
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.884843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm9d9" event={"ID":"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac","Type":"ContainerDied","Data":"5d5e0febf80ed4282e61f8380eff77836a90544898e5ff129bf2d82dd15449ea"}
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.891975 4816 generic.go:334] "Generic (PLEG): container finished" podID="a25abdc0-8516-4747-a589-78db9bc64ca3" containerID="19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542" exitCode=0
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.892083 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnlpc" event={"ID":"a25abdc0-8516-4747-a589-78db9bc64ca3","Type":"ContainerDied","Data":"19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542"}
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.907791 4816 generic.go:334] "Generic (PLEG): container finished" podID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" containerID="16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76" exitCode=0
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.907894 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-w2lrf" event={"ID":"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6","Type":"ContainerDied","Data":"16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76"}
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.910597 4816 generic.go:334] "Generic (PLEG): container finished" podID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerID="f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7" exitCode=0
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.910702 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerDied","Data":"f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7"}
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.917125 4816 generic.go:334] "Generic (PLEG): container finished" podID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" containerID="369058cbb8fa5b6fcca641d0c6bacd8fb984decb4576950458c2fae4a2d14692" exitCode=0
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.917272 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-gkcsc" event={"ID":"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16","Type":"ContainerDied","Data":"369058cbb8fa5b6fcca641d0c6bacd8fb984decb4576950458c2fae4a2d14692"}
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.919037 4816 generic.go:334] "Generic (PLEG): container finished" podID="27a1317c-41a6-4589-949b-e422d7fe8837" containerID="ae9a5cdf2df1a6846c30df048ff752db89454e8f6330fe73c2c82145d550960b" exitCode=0
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.919104 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-gxhxz" event={"ID":"27a1317c-41a6-4589-949b-e422d7fe8837","Type":"ContainerDied","Data":"ae9a5cdf2df1a6846c30df048ff752db89454e8f6330fe73c2c82145d550960b"}
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.925146 4816 generic.go:334] "Generic (PLEG): container finished" podID="66951176-170f-4d49-9a92-aeeb66f4a79c" containerID="234b74962788658b9515670058c8f55bb2409a552461ddec719b37310c8f7e0d" exitCode=0
Mar 11 12:18:00 crc kubenswrapper[4816]: I0311 12:18:00.925200 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4kpfn"
event={"ID":"66951176-170f-4d49-9a92-aeeb66f4a79c","Type":"ContainerDied","Data":"234b74962788658b9515670058c8f55bb2409a552461ddec719b37310c8f7e0d"} Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.082579 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.188628 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"] Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.254524 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.254730 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.254816 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.254899 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.255010 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jwfrm\" (UniqueName: \"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") pod \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\" (UID: \"bcc1a78b-c3d2-4c15-81a0-0431da953e51\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.266305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm" (OuterVolumeSpecName: "kube-api-access-jwfrm") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "kube-api-access-jwfrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.320240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config" (OuterVolumeSpecName: "config") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.320731 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.325148 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.335174 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bcc1a78b-c3d2-4c15-81a0-0431da953e51" (UID: "bcc1a78b-c3d2-4c15-81a0-0431da953e51"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358512 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358549 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358563 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwfrm\" (UniqueName: \"kubernetes.io/projected/bcc1a78b-c3d2-4c15-81a0-0431da953e51-kube-api-access-jwfrm\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358574 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.358583 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcc1a78b-c3d2-4c15-81a0-0431da953e51-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.402028 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xm9d9" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.562019 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") pod \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.562232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") pod \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\" (UID: \"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac\") " Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.563104 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" (UID: "b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.568024 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8" (OuterVolumeSpecName: "kube-api-access-dwfp8") pod "b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" (UID: "b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac"). InnerVolumeSpecName "kube-api-access-dwfp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.664909 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.664946 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwfp8\" (UniqueName: \"kubernetes.io/projected/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac-kube-api-access-dwfp8\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.935790 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553858-brk44" event={"ID":"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193","Type":"ContainerStarted","Data":"eca476c37d977dc52f8c2a3d4b650eb7ecd907ca671efa005aedc89c638e9ac8"} Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.937544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm9d9" event={"ID":"b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac","Type":"ContainerDied","Data":"29a2d35d034352fc106b2d5ac30a149571b3d9b03ee18c6197d6c9b89fe24636"} Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.937730 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a2d35d034352fc106b2d5ac30a149571b3d9b03ee18c6197d6c9b89fe24636" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.937954 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xm9d9" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.946802 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" event={"ID":"bcc1a78b-c3d2-4c15-81a0-0431da953e51","Type":"ContainerDied","Data":"ba3c97adc7cc798d326f3771649f02fd21d888d95dfd0aad666803c28f5b240b"} Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.946872 4816 scope.go:117] "RemoveContainer" containerID="f1a234613505f291637cb739619dbef7845308ac22057594b971bae3924f2dc7" Mar 11 12:18:01 crc kubenswrapper[4816]: I0311 12:18:01.947307 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-wqn2t" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.006324 4816 scope.go:117] "RemoveContainer" containerID="035e208fb3e5fc9b968f1db57d46e9bd63d57178d448cbc27d1282a58427f605" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.010353 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.021092 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-wqn2t"] Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.148064 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" path="/var/lib/kubelet/pods/bcc1a78b-c3d2-4c15-81a0-0431da953e51/volumes" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.414725 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.557305 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.569031 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnlpc" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.571406 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.582440 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") pod \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.582584 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") pod \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\" (UID: \"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.584149 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" (UID: "f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.598544 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4kpfn" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.603591 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs" (OuterVolumeSpecName: "kube-api-access-k6fcs") pod "f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" (UID: "f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16"). InnerVolumeSpecName "kube-api-access-k6fcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.683910 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") pod \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684014 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") pod \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\" (UID: \"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684086 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") pod \"27a1317c-41a6-4589-949b-e422d7fe8837\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684123 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") pod \"a25abdc0-8516-4747-a589-78db9bc64ca3\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " Mar 11 12:18:02 crc 
kubenswrapper[4816]: I0311 12:18:02.684454 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" (UID: "e82cb42a-5dbf-43d1-a71c-18b3e6d252d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684585 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") pod \"27a1317c-41a6-4589-949b-e422d7fe8837\" (UID: \"27a1317c-41a6-4589-949b-e422d7fe8837\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.684630 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxfx\" (UniqueName: \"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") pod \"a25abdc0-8516-4747-a589-78db9bc64ca3\" (UID: \"a25abdc0-8516-4747-a589-78db9bc64ca3\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.685084 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6fcs\" (UniqueName: \"kubernetes.io/projected/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-kube-api-access-k6fcs\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.685167 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.685175 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 
12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.685620 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a25abdc0-8516-4747-a589-78db9bc64ca3" (UID: "a25abdc0-8516-4747-a589-78db9bc64ca3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.686228 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27a1317c-41a6-4589-949b-e422d7fe8837" (UID: "27a1317c-41a6-4589-949b-e422d7fe8837"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.688356 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx" (OuterVolumeSpecName: "kube-api-access-pbxfx") pod "a25abdc0-8516-4747-a589-78db9bc64ca3" (UID: "a25abdc0-8516-4747-a589-78db9bc64ca3"). InnerVolumeSpecName "kube-api-access-pbxfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.688896 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq" (OuterVolumeSpecName: "kube-api-access-zk4zq") pod "e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" (UID: "e82cb42a-5dbf-43d1-a71c-18b3e6d252d6"). InnerVolumeSpecName "kube-api-access-zk4zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.689383 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm" (OuterVolumeSpecName: "kube-api-access-w2dkm") pod "27a1317c-41a6-4589-949b-e422d7fe8837" (UID: "27a1317c-41a6-4589-949b-e422d7fe8837"). InnerVolumeSpecName "kube-api-access-w2dkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.787754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") pod \"66951176-170f-4d49-9a92-aeeb66f4a79c\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788006 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") pod \"66951176-170f-4d49-9a92-aeeb66f4a79c\" (UID: \"66951176-170f-4d49-9a92-aeeb66f4a79c\") " Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788585 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a25abdc0-8516-4747-a589-78db9bc64ca3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788617 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2dkm\" (UniqueName: \"kubernetes.io/projected/27a1317c-41a6-4589-949b-e422d7fe8837-kube-api-access-w2dkm\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788633 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxfx\" (UniqueName: 
\"kubernetes.io/projected/a25abdc0-8516-4747-a589-78db9bc64ca3-kube-api-access-pbxfx\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788647 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk4zq\" (UniqueName: \"kubernetes.io/projected/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6-kube-api-access-zk4zq\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788661 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27a1317c-41a6-4589-949b-e422d7fe8837-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.788648 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66951176-170f-4d49-9a92-aeeb66f4a79c" (UID: "66951176-170f-4d49-9a92-aeeb66f4a79c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.792367 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6" (OuterVolumeSpecName: "kube-api-access-zm2m6") pod "66951176-170f-4d49-9a92-aeeb66f4a79c" (UID: "66951176-170f-4d49-9a92-aeeb66f4a79c"). InnerVolumeSpecName "kube-api-access-zm2m6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.890329 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm2m6\" (UniqueName: \"kubernetes.io/projected/66951176-170f-4d49-9a92-aeeb66f4a79c-kube-api-access-zm2m6\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.890379 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66951176-170f-4d49-9a92-aeeb66f4a79c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.959092 4816 generic.go:334] "Generic (PLEG): container finished" podID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" containerID="0d27f73615e32aa404576eea9593c729502e37fe26b5c92717c4bee0b43a98e6" exitCode=0 Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.959906 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553858-brk44" event={"ID":"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193","Type":"ContainerDied","Data":"0d27f73615e32aa404576eea9593c729502e37fe26b5c92717c4bee0b43a98e6"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.967572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-gkcsc" event={"ID":"f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16","Type":"ContainerDied","Data":"e8ca11c82565ae3f35fb6628a625997e61984edf2584bf5b6f01f77d24b2ea45"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.967617 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8ca11c82565ae3f35fb6628a625997e61984edf2584bf5b6f01f77d24b2ea45" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.967686 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-gkcsc" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.981620 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-gxhxz" event={"ID":"27a1317c-41a6-4589-949b-e422d7fe8837","Type":"ContainerDied","Data":"04ba277b83a2833a8a1e2667f16776d5e0146b956d56c20da2b18a82224445d9"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.981668 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04ba277b83a2833a8a1e2667f16776d5e0146b956d56c20da2b18a82224445d9" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.981733 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-gxhxz" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.985905 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4kpfn" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.987052 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4kpfn" event={"ID":"66951176-170f-4d49-9a92-aeeb66f4a79c","Type":"ContainerDied","Data":"f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.987141 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1a99dc130938800f8f6978214fdd8b7e688157675e3e3050193a9527e3f2bb1" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.989214 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cnlpc" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.989232 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnlpc" event={"ID":"a25abdc0-8516-4747-a589-78db9bc64ca3","Type":"ContainerDied","Data":"dbe8001a30913aacdd0f84f1edb48ad5e6ee0a7f33e24c58a948426697a06086"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.989455 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe8001a30913aacdd0f84f1edb48ad5e6ee0a7f33e24c58a948426697a06086" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.992319 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-w2lrf" event={"ID":"e82cb42a-5dbf-43d1-a71c-18b3e6d252d6","Type":"ContainerDied","Data":"b6cfc328bc6a5ec19bbd297d5021170486b6ed13ec54ef1b30a52f9abba8f3f9"} Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.992358 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6cfc328bc6a5ec19bbd297d5021170486b6ed13ec54ef1b30a52f9abba8f3f9" Mar 11 12:18:02 crc kubenswrapper[4816]: I0311 12:18:02.992359 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-w2lrf" Mar 11 12:18:05 crc kubenswrapper[4816]: I0311 12:18:05.885514 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.033584 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553858-brk44" event={"ID":"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193","Type":"ContainerDied","Data":"eca476c37d977dc52f8c2a3d4b650eb7ecd907ca671efa005aedc89c638e9ac8"} Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.034217 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca476c37d977dc52f8c2a3d4b650eb7ecd907ca671efa005aedc89c638e9ac8" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.033879 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553858-brk44" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.059849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") pod \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\" (UID: \"99d8cc8e-8af3-41b3-bb8c-6e4e10f00193\") " Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.065900 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf" (OuterVolumeSpecName: "kube-api-access-q6ppf") pod "99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" (UID: "99d8cc8e-8af3-41b3-bb8c-6e4e10f00193"). InnerVolumeSpecName "kube-api-access-q6ppf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.161804 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6ppf\" (UniqueName: \"kubernetes.io/projected/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193-kube-api-access-q6ppf\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:06 crc kubenswrapper[4816]: I0311 12:18:06.991890 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.023667 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553852-tvtrs"] Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.075045 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kbmsk" event={"ID":"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56","Type":"ContainerStarted","Data":"5f7cdb31826f59ca1238a145635210cb534eb8b83a42327083142e83ef21c961"} Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.077354 4816 generic.go:334] "Generic (PLEG): container finished" podID="b6745bae-b403-4a86-9148-8baecc00f8b1" containerID="e5f24ad51eefb627e15014c3582b64a13468c820d9ad9ccfa53acd2f0fb30054" exitCode=0 Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.077410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n98v5" event={"ID":"b6745bae-b403-4a86-9148-8baecc00f8b1","Type":"ContainerDied","Data":"e5f24ad51eefb627e15014c3582b64a13468c820d9ad9ccfa53acd2f0fb30054"} Mar 11 12:18:07 crc kubenswrapper[4816]: I0311 12:18:07.091777 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kbmsk" podStartSLOduration=2.654693612 podStartE2EDuration="9.091752753s" podCreationTimestamp="2026-03-11 12:17:58 +0000 UTC" firstStartedPulling="2026-03-11 12:17:59.482268908 +0000 UTC m=+1166.073532875" lastFinishedPulling="2026-03-11 12:18:05.919328049 +0000 UTC 
m=+1172.510592016" observedRunningTime="2026-03-11 12:18:07.090139976 +0000 UTC m=+1173.681403943" watchObservedRunningTime="2026-03-11 12:18:07.091752753 +0000 UTC m=+1173.683016720" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.145767 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d15245-e206-4f60-a05c-9888a45a1aca" path="/var/lib/kubelet/pods/f1d15245-e206-4f60-a05c-9888a45a1aca/volumes" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.519349 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n98v5" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.712477 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") pod \"b6745bae-b403-4a86-9148-8baecc00f8b1\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.712555 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") pod \"b6745bae-b403-4a86-9148-8baecc00f8b1\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.712709 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") pod \"b6745bae-b403-4a86-9148-8baecc00f8b1\" (UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.712810 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") pod \"b6745bae-b403-4a86-9148-8baecc00f8b1\" 
(UID: \"b6745bae-b403-4a86-9148-8baecc00f8b1\") " Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.719520 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b6745bae-b403-4a86-9148-8baecc00f8b1" (UID: "b6745bae-b403-4a86-9148-8baecc00f8b1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.719607 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx" (OuterVolumeSpecName: "kube-api-access-pghjx") pod "b6745bae-b403-4a86-9148-8baecc00f8b1" (UID: "b6745bae-b403-4a86-9148-8baecc00f8b1"). InnerVolumeSpecName "kube-api-access-pghjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.741456 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6745bae-b403-4a86-9148-8baecc00f8b1" (UID: "b6745bae-b403-4a86-9148-8baecc00f8b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.762782 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data" (OuterVolumeSpecName: "config-data") pod "b6745bae-b403-4a86-9148-8baecc00f8b1" (UID: "b6745bae-b403-4a86-9148-8baecc00f8b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.815955 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pghjx\" (UniqueName: \"kubernetes.io/projected/b6745bae-b403-4a86-9148-8baecc00f8b1-kube-api-access-pghjx\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.816041 4816 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.816077 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:08 crc kubenswrapper[4816]: I0311 12:18:08.816104 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6745bae-b403-4a86-9148-8baecc00f8b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.097773 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n98v5" event={"ID":"b6745bae-b403-4a86-9148-8baecc00f8b1","Type":"ContainerDied","Data":"7ade065f6f708de586323f677e56810ada0b99da337e5a079b57da2cc0b0b5a0"} Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.098210 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ade065f6f708de586323f677e56810ada0b99da337e5a079b57da2cc0b0b5a0" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.098299 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n98v5" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.483040 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.484702 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="dnsmasq-dns" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.484798 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="dnsmasq-dns" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.484869 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" containerName="glance-db-sync" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.484938 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" containerName="glance-db-sync" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.485051 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66951176-170f-4d49-9a92-aeeb66f4a79c" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.485117 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="66951176-170f-4d49-9a92-aeeb66f4a79c" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.485185 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" containerName="oc" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.485273 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" containerName="oc" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.485358 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25abdc0-8516-4747-a589-78db9bc64ca3" 
containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.485464 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25abdc0-8516-4747-a589-78db9bc64ca3" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.485574 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a1317c-41a6-4589-949b-e422d7fe8837" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.485651 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a1317c-41a6-4589-949b-e422d7fe8837" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.487225 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.487327 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.487389 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="init" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.487446 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="init" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.487506 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.487568 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: E0311 12:18:09.487624 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.487677 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488021 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488101 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a1317c-41a6-4589-949b-e422d7fe8837" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488168 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc1a78b-c3d2-4c15-81a0-0431da953e51" containerName="dnsmasq-dns" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488277 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488345 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" containerName="oc" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488421 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" containerName="glance-db-sync" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488487 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" containerName="mariadb-account-create-update" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.488554 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="66951176-170f-4d49-9a92-aeeb66f4a79c" containerName="mariadb-database-create" Mar 11 12:18:09 crc 
kubenswrapper[4816]: I0311 12:18:09.488613 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25abdc0-8516-4747-a589-78db9bc64ca3" containerName="mariadb-database-create" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.489738 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.500714 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.515107 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.515161 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.630729 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.630813 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") pod 
\"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.631133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.631259 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.631433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.631574 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733649 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733730 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733781 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733819 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.733880 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.734771 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.734805 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.735010 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.735135 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.735308 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") pod \"dnsmasq-dns-6f88567fd9-qp995\" 
(UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.766975 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") pod \"dnsmasq-dns-6f88567fd9-qp995\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") " pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:09 crc kubenswrapper[4816]: I0311 12:18:09.815036 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:10 crc kubenswrapper[4816]: I0311 12:18:10.116049 4816 generic.go:334] "Generic (PLEG): container finished" podID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" containerID="5f7cdb31826f59ca1238a145635210cb534eb8b83a42327083142e83ef21c961" exitCode=0 Mar 11 12:18:10 crc kubenswrapper[4816]: I0311 12:18:10.116109 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kbmsk" event={"ID":"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56","Type":"ContainerDied","Data":"5f7cdb31826f59ca1238a145635210cb534eb8b83a42327083142e83ef21c961"} Mar 11 12:18:10 crc kubenswrapper[4816]: I0311 12:18:10.299718 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.126127 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9624e97-8103-4296-b562-982cf05abfec" containerID="d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85" exitCode=0 Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.126267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerDied","Data":"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85"} Mar 11 12:18:11 crc 
kubenswrapper[4816]: I0311 12:18:11.126663 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerStarted","Data":"1722f447074997662412f081f41e66350a45168b3ef01991c199f5c589a81402"} Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.447562 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.574832 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") pod \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.575092 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") pod \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.575138 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") pod \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\" (UID: \"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56\") " Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.580607 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m" (OuterVolumeSpecName: "kube-api-access-x9c6m") pod "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" (UID: "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56"). InnerVolumeSpecName "kube-api-access-x9c6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.606439 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" (UID: "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.621264 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data" (OuterVolumeSpecName: "config-data") pod "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" (UID: "45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.677473 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.677517 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9c6m\" (UniqueName: \"kubernetes.io/projected/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-kube-api-access-x9c6m\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:11 crc kubenswrapper[4816]: I0311 12:18:11.677529 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.138920 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kbmsk" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.141113 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kbmsk" event={"ID":"45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56","Type":"ContainerDied","Data":"0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e"} Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.141173 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f22b75107e8cd42721b969d6ddcf352087537aee228dbbf09d44200f875243e" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.141303 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerStarted","Data":"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"} Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.142665 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.176032 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" podStartSLOduration=3.176002976 podStartE2EDuration="3.176002976s" podCreationTimestamp="2026-03-11 12:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:12.167197793 +0000 UTC m=+1178.758461780" watchObservedRunningTime="2026-03-11 12:18:12.176002976 +0000 UTC m=+1178.767266953" Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.423986 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"] Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.455289 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fvdtl"] Mar 11 12:18:12 crc 
kubenswrapper[4816]: E0311 12:18:12.455709 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" containerName="keystone-db-sync"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.455730 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" containerName="keystone-db-sync"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.455958 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" containerName="keystone-db-sync"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.456653 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.461035 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.461229 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.461407 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.462600 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8k5jj"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.464130 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.489520 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fvdtl"]
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.525315 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"]
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.526930 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.597855 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.597915 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.597950 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.598004 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.598042 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.598108 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.598456 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"]
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.700591 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.700687 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.700785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.700841 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701506 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701543 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701568 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701646 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701680 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701705 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701728 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.701753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.709697 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.710473 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.712790 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.713376 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fjmnw"]
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.714935 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.717081 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.718287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.720835 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.721179 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.721189 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6qw4t"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.726758 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fjmnw"]
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.774975 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") pod \"keystone-bootstrap-fvdtl\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802688 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802744 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802775 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802809 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802833 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802853 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802873 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802902 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802965 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.802991 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.803008 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.804145 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.805655 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.806200 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.807803 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.823839 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.846156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") pod \"dnsmasq-dns-5cb4dcfdd7-frz8f\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.856561 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.859054 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.883725 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.884027 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.893539 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-tdv64"]
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.894179 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.895756 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.904888 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.904957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.905036 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.905106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.905172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.905211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.924883 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tdv64"]
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.929159 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.955935 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.956319 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.956583 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-48x47"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.960028 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.980046 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:12 crc kubenswrapper[4816]: I0311 12:18:12.985787 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007504 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007556 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007584 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007633 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007655 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007768 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007807 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.007849 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.008204 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.008332 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.025424 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.038692 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") pod \"cinder-db-sync-fjmnw\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.060636 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.078103 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fvdtl"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.087127 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rjxsf"]
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.106076 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rjxsf"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.119445 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.119965 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fxmtd"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.130484 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.148035 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.151558 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.171887 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.171964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172054 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172271 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172445 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.172476 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.180369 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rjxsf"]
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.151553 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.182137 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.182588 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.194700 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.195872 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.223639 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.224323 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.237467 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.241389 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") pod \"ceilometer-0\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " pod="openstack/ceilometer-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.242330 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") pod \"neutron-db-sync-tdv64\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") " pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.269335 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"]
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.274602 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.274676 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.274710 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.309363 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4b4ms"]
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.310667 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4b4ms"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.327658 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.328333 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l2nzr"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.328485 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.353069 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjmnw"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.369571 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4b4ms"]
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.376898 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.376958 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.376988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") pod \"barbican-db-sync-rjxsf\" (UID:
\"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377065 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377092 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377109 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.377155 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") pod \"placement-db-sync-4b4ms\" (UID: 
\"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.392525 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.401877 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.406461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.424739 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.424892 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.444103 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tdv64" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.459345 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.471423 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") pod \"barbican-db-sync-rjxsf\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") " pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.479631 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.479750 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.479823 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 
12:18:13.479849 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.479873 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.486905 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.487434 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rjxsf" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.495706 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.497705 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.508809 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.520195 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") pod \"placement-db-sync-4b4ms\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") " pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.524416 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4b4ms" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596459 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596536 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596557 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596592 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596672 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") pod 
\"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.596702 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.693431 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.697406 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.698211 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.700114 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704345 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " 
pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704437 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704688 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704718 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.704847 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.706458 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc 
kubenswrapper[4816]: I0311 12:18:13.706746 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.707473 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.708057 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.710461 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.711002 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.711428 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-22dm7" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.711556 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.755261 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 
12:18:13.759266 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.760155 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") pod \"dnsmasq-dns-759cc7f497-2nfvt\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.763205 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.785677 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.808804 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809464 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809537 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809619 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809650 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809675 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.809702 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.862639 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911149 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911268 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911423 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911445 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911471 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911497 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 
12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911558 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911586 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911618 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.911656 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.912215 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:13 crc kubenswrapper[4816]: 
I0311 12:18:13.912878 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.913199 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.925022 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.925736 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.944966 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0"
Mar 11 12:18:13 crc kubenswrapper[4816]: I0311 12:18:13.945109 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.016312 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " pod="openstack/glance-default-external-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.018076 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.018160 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.018219 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.018934 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.019070 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.034626 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.034763 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.034961 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.035196 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.036310 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.044660 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.049384 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.049921 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.056692 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.057910 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") pod \"glance-default-internal-api-0\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.061374 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"]
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.062205 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.084815 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.170843 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fvdtl"]
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.259312 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" event={"ID":"0177cd91-bf8f-4e82-9f8b-5c50118dee09","Type":"ContainerStarted","Data":"93628c673521ca0506bed73eea0c7faf2d887ad9a1a290cd554669daa35c3ce3"}
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.259280 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="dnsmasq-dns" containerID="cri-o://ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f" gracePeriod=10
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.365759 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fjmnw"]
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.412511 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4b4ms"]
Mar 11 12:18:14 crc kubenswrapper[4816]: W0311 12:18:14.448707 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2772ef82_fe14_4f4d_8349_8ee515e39979.slice/crio-f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422 WatchSource:0}: Error finding container f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422: Status 404 returned error can't find the container with id f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.816332 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.826106 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tdv64"]
Mar 11 12:18:14 crc kubenswrapper[4816]: I0311 12:18:14.842992 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rjxsf"]
Mar 11 12:18:14 crc kubenswrapper[4816]: W0311 12:18:14.886529 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc643aa04_ce8d_4c3b_befc_ecdf63e35de8.slice/crio-1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2 WatchSource:0}: Error finding container 1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2: Status 404 returned error can't find the container with id 1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.023073 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f88567fd9-qp995"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094316 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094432 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094491 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094609 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094636 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.094661 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") pod \"a9624e97-8103-4296-b562-982cf05abfec\" (UID: \"a9624e97-8103-4296-b562-982cf05abfec\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.132808 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x" (OuterVolumeSpecName: "kube-api-access-zh48x") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "kube-api-access-zh48x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.198474 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh48x\" (UniqueName: \"kubernetes.io/projected/a9624e97-8103-4296-b562-982cf05abfec-kube-api-access-zh48x\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.225070 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"]
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.292168 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fvdtl" event={"ID":"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85","Type":"ContainerStarted","Data":"0800101b998bc39fbc15280d1397d1d43d3447b5f82870b95f5c0e69e60ff601"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.292275 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fvdtl" event={"ID":"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85","Type":"ContainerStarted","Data":"6cce3dedf2f34ff6f1ab7cb71e5194415d6238bd65c1baa92383feb742726164"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.295200 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"8ef644ece8e49d46c6b3d18fe5c5f96913f607e9b6a202c08e5f7ee442c27c93"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.304448 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config" (OuterVolumeSpecName: "config") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.306735 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4b4ms" event={"ID":"f92c8acc-1a4a-4f28-a123-2f5b8b6905af","Type":"ContainerStarted","Data":"31e496272578b057f389702c22da6db4b04713d9b39444d9f2071398a63be537"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325849 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9624e97-8103-4296-b562-982cf05abfec" containerID="ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f" exitCode=0
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325894 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f88567fd9-qp995"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325903 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerDied","Data":"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f88567fd9-qp995" event={"ID":"a9624e97-8103-4296-b562-982cf05abfec","Type":"ContainerDied","Data":"1722f447074997662412f081f41e66350a45168b3ef01991c199f5c589a81402"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.325993 4816 scope.go:117] "RemoveContainer" containerID="ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.327404 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rjxsf" event={"ID":"c643aa04-ce8d-4c3b-befc-ecdf63e35de8","Type":"ContainerStarted","Data":"1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.336542 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fvdtl" podStartSLOduration=3.336508962 podStartE2EDuration="3.336508962s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:15.318321568 +0000 UTC m=+1181.909585535" watchObservedRunningTime="2026-03-11 12:18:15.336508962 +0000 UTC m=+1181.927772929"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.336836 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerStarted","Data":"7732a86e8dc12bafbe8cdaac586dd615d3b76e080ef096246aeda54dd0e49383"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.339226 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tdv64" event={"ID":"3ae20611-891b-49ee-b5b8-0dad8af80906","Type":"ContainerStarted","Data":"407febd9600a7f2ac248ee17af289c238ad46396cc3e57692031c09ed1d62622"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.340910 4816 generic.go:334] "Generic (PLEG): container finished" podID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" containerID="2e5a3cf87af6703aca115bdf04591ed2e43c5e703b0abce69ed3bd4c9e44028a" exitCode=0
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.341554 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" event={"ID":"0177cd91-bf8f-4e82-9f8b-5c50118dee09","Type":"ContainerDied","Data":"2e5a3cf87af6703aca115bdf04591ed2e43c5e703b0abce69ed3bd4c9e44028a"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.348926 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.355217 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjmnw" event={"ID":"2772ef82-fe14-4f4d-8349-8ee515e39979","Type":"ContainerStarted","Data":"f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422"}
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.401802 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.438717 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.439807 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.470861 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.480977 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a9624e97-8103-4296-b562-982cf05abfec" (UID: "a9624e97-8103-4296-b562-982cf05abfec"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.503701 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.503756 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.503768 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.503777 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9624e97-8103-4296-b562-982cf05abfec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.515269 4816 scope.go:117] "RemoveContainer" containerID="d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.583760 4816 scope.go:117] "RemoveContainer" containerID="ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"
Mar 11 12:18:15 crc kubenswrapper[4816]: E0311 12:18:15.586511 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f\": container with ID starting with ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f not found: ID does not exist" containerID="ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.586573 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f"} err="failed to get container status \"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f\": rpc error: code = NotFound desc = could not find container \"ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f\": container with ID starting with ee8d2f2b925c615ace182c5fd5da3ff6a336e3776cbbadfd70dee5d93ca1e16f not found: ID does not exist"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.586601 4816 scope.go:117] "RemoveContainer" containerID="d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85"
Mar 11 12:18:15 crc kubenswrapper[4816]: E0311 12:18:15.599750 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85\": container with ID starting with d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85 not found: ID does not exist" containerID="d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.599803 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85"} err="failed to get container status \"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85\": rpc error: code = NotFound desc = could not find container \"d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85\": container with ID starting with d502adb85c9649e0c73366c0ab0b1ca37267e7cf7c2700765c2cc8a052a71f85 not found: ID does not exist"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.687268 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"]
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.696611 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f88567fd9-qp995"]
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.778193 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.781380 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920420 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920547 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920653 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920726 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") "
Mar 11 12:18:15 crc kubenswrapper[4816]: I0311 12:18:15.920840 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") pod \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\" (UID: \"0177cd91-bf8f-4e82-9f8b-5c50118dee09\") "
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:15.993961 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv" (OuterVolumeSpecName: "kube-api-access-vtjdv") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "kube-api-access-vtjdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.039335 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.039971 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtjdv\" (UniqueName: \"kubernetes.io/projected/0177cd91-bf8f-4e82-9f8b-5c50118dee09-kube-api-access-vtjdv\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.040008 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.063291 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.070832 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.120588 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.126308 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config" (OuterVolumeSpecName: "config") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.143489 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.143805 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.144737 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.158058 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0177cd91-bf8f-4e82-9f8b-5c50118dee09" (UID: "0177cd91-bf8f-4e82-9f8b-5c50118dee09"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.248267 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0177cd91-bf8f-4e82-9f8b-5c50118dee09-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.271316 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9624e97-8103-4296-b562-982cf05abfec" path="/var/lib/kubelet/pods/a9624e97-8103-4296-b562-982cf05abfec/volumes"
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.272288 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.272328 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.415640 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad047cd1-309a-401e-9fc6-cb1349614136" containerID="f842cab6fbb753d4036f93abfc735f41fd91ab93fdcba8e10b330c30d7aa8346" exitCode=0
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.415748 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerDied","Data":"f842cab6fbb753d4036f93abfc735f41fd91ab93fdcba8e10b330c30d7aa8346"}
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.419199 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerStarted","Data":"46c804665ae23bbf6170282e95342e08c9ca8c59a8646dc8a3bb729ae4357ed1"}
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.429018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f" event={"ID":"0177cd91-bf8f-4e82-9f8b-5c50118dee09","Type":"ContainerDied","Data":"93628c673521ca0506bed73eea0c7faf2d887ad9a1a290cd554669daa35c3ce3"}
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.429074 4816 scope.go:117] "RemoveContainer" containerID="2e5a3cf87af6703aca115bdf04591ed2e43c5e703b0abce69ed3bd4c9e44028a"
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.429235 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.451824 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tdv64" event={"ID":"3ae20611-891b-49ee-b5b8-0dad8af80906","Type":"ContainerStarted","Data":"315146a94731475a01dc83fd91cffc1dc07e3b8364e3b5f9f4c74f1dffcbe0c4"}
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.456414 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerStarted","Data":"e01256e648c1540249f5558a07356e2451e9fbb9af609837778e77bf7b2923ca"}
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.555709 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"]
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.606932 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-tdv64" podStartSLOduration=4.606914278 podStartE2EDuration="4.606914278s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:16.54969433 +0000 UTC m=+1183.140958297" watchObservedRunningTime="2026-03-11 12:18:16.606914278 +0000 UTC m=+1183.198178245"
Mar 11 12:18:16 crc kubenswrapper[4816]: I0311 12:18:16.607315 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-frz8f"]
Mar 11 12:18:17 crc kubenswrapper[4816]: I0311 12:18:17.513522 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerStarted","Data":"ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c"}
Mar 11 12:18:17 crc kubenswrapper[4816]: I0311 12:18:17.514031 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt"
Mar 11 12:18:17 crc kubenswrapper[4816]: I0311 12:18:17.522230 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerStarted","Data":"c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496"}
Mar 11 12:18:17 crc kubenswrapper[4816]: I0311 12:18:17.551831 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" podStartSLOduration=4.551794738 podStartE2EDuration="4.551794738s" podCreationTimestamp="2026-03-11 12:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:17.543000085 +0000 UTC m=+1184.134264052" watchObservedRunningTime="2026-03-11 12:18:17.551794738 +0000 UTC m=+1184.143058715"
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.147800 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" path="/var/lib/kubelet/pods/0177cd91-bf8f-4e82-9f8b-5c50118dee09/volumes"
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.594526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerStarted","Data":"e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d"}
Mar 11
12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.594741 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-log" containerID="cri-o://c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496" gracePeriod=30
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.595183 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-httpd" containerID="cri-o://e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d" gracePeriod=30
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.620582 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerStarted","Data":"5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d"}
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.620807 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerStarted","Data":"391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee"}
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.621073 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-log" containerID="cri-o://391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee" gracePeriod=30
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.621345 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-httpd" containerID="cri-o://5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d" gracePeriod=30
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.638636 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.6385993150000004 podStartE2EDuration="6.638599315s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:18.631967044 +0000 UTC m=+1185.223231011" watchObservedRunningTime="2026-03-11 12:18:18.638599315 +0000 UTC m=+1185.229863282"
Mar 11 12:18:18 crc kubenswrapper[4816]: I0311 12:18:18.681958 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.681923523 podStartE2EDuration="6.681923523s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:18.660724792 +0000 UTC m=+1185.251988759" watchObservedRunningTime="2026-03-11 12:18:18.681923523 +0000 UTC m=+1185.273187490"
Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.657276 4816 generic.go:334] "Generic (PLEG): container finished" podID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerID="5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d" exitCode=143
Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.657629 4816 generic.go:334] "Generic (PLEG): container finished" podID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerID="391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee" exitCode=143
Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.657312 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerDied","Data":"5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d"}
Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.657764 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerDied","Data":"391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee"}
Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.662627 4816 generic.go:334] "Generic (PLEG): container finished" podID="d7206357-ec52-4320-b659-a027694a74a9" containerID="e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d" exitCode=0
Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.662671 4816 generic.go:334] "Generic (PLEG): container finished" podID="d7206357-ec52-4320-b659-a027694a74a9" containerID="c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496" exitCode=143
Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.662696 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerDied","Data":"e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d"}
Mar 11 12:18:19 crc kubenswrapper[4816]: I0311 12:18:19.662722 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerDied","Data":"c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496"}
Mar 11 12:18:20 crc kubenswrapper[4816]: I0311 12:18:20.677108 4816 generic.go:334] "Generic (PLEG): container finished" podID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" containerID="0800101b998bc39fbc15280d1397d1d43d3447b5f82870b95f5c0e69e60ff601" exitCode=0
Mar 11 12:18:20 crc kubenswrapper[4816]: I0311 12:18:20.677188 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/keystone-bootstrap-fvdtl" event={"ID":"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85","Type":"ContainerDied","Data":"0800101b998bc39fbc15280d1397d1d43d3447b5f82870b95f5c0e69e60ff601"}
Mar 11 12:18:23 crc kubenswrapper[4816]: I0311 12:18:23.866528 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt"
Mar 11 12:18:23 crc kubenswrapper[4816]: I0311 12:18:23.955428 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"]
Mar 11 12:18:23 crc kubenswrapper[4816]: I0311 12:18:23.955743 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" containerID="cri-o://e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2" gracePeriod=10
Mar 11 12:18:25 crc kubenswrapper[4816]: I0311 12:18:25.101682 4816 generic.go:334] "Generic (PLEG): container finished" podID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerID="e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2" exitCode=0
Mar 11 12:18:25 crc kubenswrapper[4816]: I0311 12:18:25.101765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerDied","Data":"e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2"}
Mar 11 12:18:25 crc kubenswrapper[4816]: I0311 12:18:25.438958 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused"
Mar 11 12:18:35 crc kubenswrapper[4816]: I0311 12:18:35.665020 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout"
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.452734 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b"
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.453567 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5n8zp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fjmnw_openstack(2772ef82-fe14-4f4d-8349-8ee515e39979): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.455282 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fjmnw" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.515156 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.515235 4816 prober.go:107] "Probe failed" probeType="Liveness"
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.521398 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.658912 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") "
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.659871 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") "
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660171 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") "
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660382 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") "
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660559 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") "
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660774 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") "
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.660960 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") pod \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\" (UID: \"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1\") "
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.661054 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs" (OuterVolumeSpecName: "logs") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.661323 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.662591 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.662625 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.671701 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.672231 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts" (OuterVolumeSpecName: "scripts") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.680757 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm" (OuterVolumeSpecName: "kube-api-access-hp2hm") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "kube-api-access-hp2hm".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.701804 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.715281 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data" (OuterVolumeSpecName: "config-data") pod "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" (UID: "68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.723205 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1","Type":"ContainerDied","Data":"e01256e648c1540249f5558a07356e2451e9fbb9af609837778e77bf7b2923ca"}
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.723332 4816 scope.go:117] "RemoveContainer" containerID="5bd79fbed29e67c023a52beb4ae91ed0ef93cd3f9f71c55834cba55ce617ed8d"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.723230 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.726864 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-fjmnw" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.764980 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.765022 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.765034 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp2hm\" (UniqueName: \"kubernetes.io/projected/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-kube-api-access-hp2hm\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.765043 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.765071 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.789924 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.804459 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.819585 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.851939 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852562 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-log"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852586 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-log"
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852624 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" containerName="init"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852632 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" containerName="init"
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852651 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="init"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852660 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="init"
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852689 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="dnsmasq-dns"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852700 4816 state_mem.go:107] "Deleted CPUSet
assignment" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="dnsmasq-dns"
Mar 11 12:18:39 crc kubenswrapper[4816]: E0311 12:18:39.852717 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-httpd"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852729 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-httpd"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.852978 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-httpd"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.853001 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0177cd91-bf8f-4e82-9f8b-5c50118dee09" containerName="init"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.853025 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" containerName="glance-log"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.853039 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9624e97-8103-4296-b562-982cf05abfec" containerName="dnsmasq-dns"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.854471 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.866240 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.867639 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.869046 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.870845 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973410 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973772 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973819 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973865 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.973926 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.974003 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.974036 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:39 crc kubenswrapper[4816]: I0311 12:18:39.974072 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID:
\"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076560 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076620 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076654 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076704 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076861 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076927 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.076978 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.077594 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.077805 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.078056 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.083508 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.096627 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.096982 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.099373 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.105177 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.116288 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.162024 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1" path="/var/lib/kubelet/pods/68ee2eb5-3e64-40bf-86c0-f1e56e57b8d1/volumes" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.194269 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.666523 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 11 12:18:40 crc kubenswrapper[4816]: I0311 12:18:40.666886 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:18:41 crc kubenswrapper[4816]: E0311 12:18:41.447947 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:b8a5d052890fb9cefa333baf10b607add227ed5d79aa108b576a97b21e89327a" Mar 11 12:18:41 crc kubenswrapper[4816]: E0311 12:18:41.448202 4816 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:b8a5d052890fb9cefa333baf10b607add227ed5d79aa108b576a97b21e89327a,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6lqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFr
om:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-4b4ms_openstack(f92c8acc-1a4a-4f28-a123-2f5b8b6905af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:18:41 crc kubenswrapper[4816]: E0311 12:18:41.450096 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-4b4ms" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.522623 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.607770 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.607899 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.607963 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 
12:18:41.608055 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.608153 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.608176 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") pod \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\" (UID: \"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85\") " Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.615350 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts" (OuterVolumeSpecName: "scripts") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.616264 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.616664 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c" (OuterVolumeSpecName: "kube-api-access-44z2c") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "kube-api-access-44z2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.635813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.641073 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.645576 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data" (OuterVolumeSpecName: "config-data") pod "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" (UID: "e391eaa0-5fb5-4ab1-a9a5-b480703f8b85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710518 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710555 4816 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710565 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710607 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710618 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44z2c\" (UniqueName: \"kubernetes.io/projected/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-kube-api-access-44z2c\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.710628 4816 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.747226 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fvdtl" Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.747226 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fvdtl" event={"ID":"e391eaa0-5fb5-4ab1-a9a5-b480703f8b85","Type":"ContainerDied","Data":"6cce3dedf2f34ff6f1ab7cb71e5194415d6238bd65c1baa92383feb742726164"} Mar 11 12:18:41 crc kubenswrapper[4816]: I0311 12:18:41.747300 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cce3dedf2f34ff6f1ab7cb71e5194415d6238bd65c1baa92383feb742726164" Mar 11 12:18:41 crc kubenswrapper[4816]: E0311 12:18:41.750487 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:b8a5d052890fb9cefa333baf10b607add227ed5d79aa108b576a97b21e89327a\\\"\"" pod="openstack/placement-db-sync-4b4ms" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.476756 4816 scope.go:117] "RemoveContainer" containerID="391e53767cf17b2f58d4934d1239e1639cc3728d74480149f45e20668d71d7ee" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.499443 4816 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.499600 4816 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zq67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-rjxsf_openstack(c643aa04-ce8d-4c3b-befc-ecdf63e35de8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.501376 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-rjxsf" 
podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.662887 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.682710 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.732914 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fvdtl"] Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.736765 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.736918 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.737119 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.737188 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc 
kubenswrapper[4816]: I0311 12:18:42.737358 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.737408 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.737475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") pod \"d7206357-ec52-4320-b659-a027694a74a9\" (UID: \"d7206357-ec52-4320-b659-a027694a74a9\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.739930 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs" (OuterVolumeSpecName: "logs") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.743839 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fvdtl"] Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.745562 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.755253 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts" (OuterVolumeSpecName: "scripts") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.755524 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.779936 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c" (OuterVolumeSpecName: "kube-api-access-cfc2c") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "kube-api-access-cfc2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.780620 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.828955 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7206357-ec52-4320-b659-a027694a74a9","Type":"ContainerDied","Data":"46c804665ae23bbf6170282e95342e08c9ca8c59a8646dc8a3bb729ae4357ed1"} Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.829013 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.829062 4816 scope.go:117] "RemoveContainer" containerID="e5a70320bc6c70cd367bbbc4bdaabb37281153f62ac1cd7c0287d723080fe33d" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843320 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843488 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843536 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843606 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.843735 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") pod \"79c46d79-aa47-428c-abec-a6f94c66e9ab\" (UID: \"79c46d79-aa47-428c-abec-a6f94c66e9ab\") " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844144 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844162 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844171 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfc2c\" (UniqueName: \"kubernetes.io/projected/d7206357-ec52-4320-b659-a027694a74a9-kube-api-access-cfc2c\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844197 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844207 4816 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7206357-ec52-4320-b659-a027694a74a9-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.844216 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.845933 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846674 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="init" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.846698 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="init" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846713 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.846721 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846738 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-httpd" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.846745 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-httpd" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846754 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" containerName="keystone-bootstrap" Mar 11 12:18:42 
crc kubenswrapper[4816]: I0311 12:18:42.846762 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" containerName="keystone-bootstrap" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.846777 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-log" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.846784 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-log" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.847451 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-log" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.847477 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7206357-ec52-4320-b659-a027694a74a9" containerName="glance-httpd" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.847534 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.847554 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" containerName="keystone-bootstrap" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.848425 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.848695 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-hhz24" event={"ID":"79c46d79-aa47-428c-abec-a6f94c66e9ab","Type":"ContainerDied","Data":"81fe766129ca8c4e2e7c56bfe354248ee38599d3499418773992e5736bf81df2"} Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.848717 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-hhz24" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.849201 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz" (OuterVolumeSpecName: "kube-api-access-g9csz") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "kube-api-access-g9csz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.852282 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.853921 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.854035 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8k5jj" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.854061 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.854390 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.856797 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.868035 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.871382 4816 scope.go:117] "RemoveContainer" containerID="c0c07679dd61d87d3bdfa74e26aa8259f34fe50f83fa1b2abc4b014796093496" Mar 11 12:18:42 crc kubenswrapper[4816]: E0311 12:18:42.871522 4816 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a\\\"\"" pod="openstack/barbican-db-sync-rjxsf" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.884146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data" (OuterVolumeSpecName: "config-data") pod "d7206357-ec52-4320-b659-a027694a74a9" (UID: "d7206357-ec52-4320-b659-a027694a74a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.911931 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.915045 4816 scope.go:117] "RemoveContainer" containerID="e929e8e02a375fef6457bbafa642c02c68d821bc19103c5cffa50d761cc569e2" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.925979 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.926788 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.932792 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config" (OuterVolumeSpecName: "config") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.941985 4816 scope.go:117] "RemoveContainer" containerID="7cfd948e58ca0b33af11396daaf98403ef86af1a5fd0724d0ce0200e144ab4fe" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.946039 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79c46d79-aa47-428c-abec-a6f94c66e9ab" (UID: "79c46d79-aa47-428c-abec-a6f94c66e9ab"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.946662 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.946737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.946930 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.947136 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.947568 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") pod \"keystone-bootstrap-w8rqc\" (UID: 
\"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.947722 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.947975 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948025 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948040 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9csz\" (UniqueName: \"kubernetes.io/projected/79c46d79-aa47-428c-abec-a6f94c66e9ab-kube-api-access-g9csz\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948053 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948065 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948077 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d7206357-ec52-4320-b659-a027694a74a9-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948107 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:42 crc kubenswrapper[4816]: I0311 12:18:42.948118 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79c46d79-aa47-428c-abec-a6f94c66e9ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050000 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050089 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050238 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050302 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.050347 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.054754 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.056411 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.057050 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") pod \"keystone-bootstrap-w8rqc\" (UID: 
\"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.057064 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.057668 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.073827 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") pod \"keystone-bootstrap-w8rqc\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.090549 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: W0311 12:18:43.095230 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9d3606c_b28d_4028_93fc_535afa127cd6.slice/crio-ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed WatchSource:0}: Error finding container ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed: Status 404 returned error can't find the container with id ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.182068 4816 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.206901 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.268605 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.288037 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.290839 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.293167 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.296149 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.311097 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.334392 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.342727 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67754df655-hhz24"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.399663 4816 scope.go:117] "RemoveContainer" containerID="258e8c83fc2dd9e9c165c147a3085d310c4de5d771038f237098b4b3a09178a8" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.475612 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.475694 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.475737 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.475771 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.476982 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.477156 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.477576 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.477641 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579712 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579879 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.579964 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580015 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580079 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580388 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.580434 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.581010 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.587180 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.587581 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.588592 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.596141 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.601092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.607957 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.613835 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:18:43 crc kubenswrapper[4816]: W0311 12:18:43.737259 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09ce1ef6_fcd0_4182_afca_22c5892b48e2.slice/crio-fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5 WatchSource:0}: Error finding container fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5: Status 404 returned error can't find the container with id fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5 Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.751669 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.864736 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w8rqc" event={"ID":"09ce1ef6-fcd0-4182-afca-22c5892b48e2","Type":"ContainerStarted","Data":"fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5"} Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.869442 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerStarted","Data":"ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed"} Mar 11 12:18:43 crc kubenswrapper[4816]: I0311 12:18:43.876468 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf"} Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.146846 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" path="/var/lib/kubelet/pods/79c46d79-aa47-428c-abec-a6f94c66e9ab/volumes" Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 
12:18:44.147768 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7206357-ec52-4320-b659-a027694a74a9" path="/var/lib/kubelet/pods/d7206357-ec52-4320-b659-a027694a74a9/volumes"
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.148742 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e391eaa0-5fb5-4ab1-a9a5-b480703f8b85" path="/var/lib/kubelet/pods/e391eaa0-5fb5-4ab1-a9a5-b480703f8b85/volumes"
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.222936 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 11 12:18:44 crc kubenswrapper[4816]: W0311 12:18:44.606507 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod439b686e_927d_425a_a218_807220ae1e95.slice/crio-d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d WatchSource:0}: Error finding container d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d: Status 404 returned error can't find the container with id d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.908277 4816 generic.go:334] "Generic (PLEG): container finished" podID="3ae20611-891b-49ee-b5b8-0dad8af80906" containerID="315146a94731475a01dc83fd91cffc1dc07e3b8364e3b5f9f4c74f1dffcbe0c4" exitCode=0
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.908697 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tdv64" event={"ID":"3ae20611-891b-49ee-b5b8-0dad8af80906","Type":"ContainerDied","Data":"315146a94731475a01dc83fd91cffc1dc07e3b8364e3b5f9f4c74f1dffcbe0c4"}
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.927284 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerStarted","Data":"d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d"}
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.963973 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w8rqc" event={"ID":"09ce1ef6-fcd0-4182-afca-22c5892b48e2","Type":"ContainerStarted","Data":"619b2be9e8a3f61b134d163bc3ebb4105259f3d6eadad7ea8f76de2333bbeac4"}
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.968797 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerStarted","Data":"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15"}
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.968843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerStarted","Data":"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b"}
Mar 11 12:18:44 crc kubenswrapper[4816]: I0311 12:18:44.988398 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-w8rqc" podStartSLOduration=2.9883773319999998 podStartE2EDuration="2.988377332s" podCreationTimestamp="2026-03-11 12:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:44.987247229 +0000 UTC m=+1211.578511186" watchObservedRunningTime="2026-03-11 12:18:44.988377332 +0000 UTC m=+1211.579641299"
Mar 11 12:18:45 crc kubenswrapper[4816]: I0311 12:18:45.667680 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67754df655-hhz24" podUID="79c46d79-aa47-428c-abec-a6f94c66e9ab" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout"
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.024146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerStarted","Data":"4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67"}
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.030577 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde"}
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.389822 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.409789 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.409770955 podStartE2EDuration="7.409770955s" podCreationTimestamp="2026-03-11 12:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:45.022572336 +0000 UTC m=+1211.613836303" watchObservedRunningTime="2026-03-11 12:18:46.409770955 +0000 UTC m=+1213.001034922"
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.576958 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") pod \"3ae20611-891b-49ee-b5b8-0dad8af80906\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") "
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.577063 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") pod \"3ae20611-891b-49ee-b5b8-0dad8af80906\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") "
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.577132 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") pod \"3ae20611-891b-49ee-b5b8-0dad8af80906\" (UID: \"3ae20611-891b-49ee-b5b8-0dad8af80906\") "
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.586304 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g" (OuterVolumeSpecName: "kube-api-access-z8z8g") pod "3ae20611-891b-49ee-b5b8-0dad8af80906" (UID: "3ae20611-891b-49ee-b5b8-0dad8af80906"). InnerVolumeSpecName "kube-api-access-z8z8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.608789 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ae20611-891b-49ee-b5b8-0dad8af80906" (UID: "3ae20611-891b-49ee-b5b8-0dad8af80906"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.622982 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config" (OuterVolumeSpecName: "config") pod "3ae20611-891b-49ee-b5b8-0dad8af80906" (UID: "3ae20611-891b-49ee-b5b8-0dad8af80906"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.680140 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.680192 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae20611-891b-49ee-b5b8-0dad8af80906-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:46 crc kubenswrapper[4816]: I0311 12:18:46.680209 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8z8g\" (UniqueName: \"kubernetes.io/projected/3ae20611-891b-49ee-b5b8-0dad8af80906-kube-api-access-z8z8g\") on node \"crc\" DevicePath \"\""
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.042548 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerStarted","Data":"ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea"}
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.047033 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tdv64" event={"ID":"3ae20611-891b-49ee-b5b8-0dad8af80906","Type":"ContainerDied","Data":"407febd9600a7f2ac248ee17af289c238ad46396cc3e57692031c09ed1d62622"}
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.047082 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="407febd9600a7f2ac248ee17af289c238ad46396cc3e57692031c09ed1d62622"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.047085 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tdv64"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.077920 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.077899755 podStartE2EDuration="4.077899755s" podCreationTimestamp="2026-03-11 12:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:47.076735401 +0000 UTC m=+1213.667999388" watchObservedRunningTime="2026-03-11 12:18:47.077899755 +0000 UTC m=+1213.669163722"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.189190 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"]
Mar 11 12:18:47 crc kubenswrapper[4816]: E0311 12:18:47.189624 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae20611-891b-49ee-b5b8-0dad8af80906" containerName="neutron-db-sync"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.189642 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae20611-891b-49ee-b5b8-0dad8af80906" containerName="neutron-db-sync"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.189810 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae20611-891b-49ee-b5b8-0dad8af80906" containerName="neutron-db-sync"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.190757 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.208852 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"]
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.295551 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9df8757bb-rzb52"]
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.300820 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.301728 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.301994 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.302237 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.302348 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.302380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.302456 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.303648 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.303915 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-48x47"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.304049 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.320873 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.371956 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"]
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404696 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404769 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404849 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404872 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.404964 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.405025 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.405057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406103 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406169 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406215 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406234 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406264 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406973 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.406991 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.407064 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.408223 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.444574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") pod \"dnsmasq-dns-6d67d65cb9-2w84f\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") " pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508736 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508823 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.508954 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.514043 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.515982 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.516023 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.520583 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.526809 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.533146 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") pod \"neutron-9df8757bb-rzb52\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:47 crc kubenswrapper[4816]: I0311 12:18:47.645098 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:18:48 crc kubenswrapper[4816]: I0311 12:18:48.069407 4816 generic.go:334] "Generic (PLEG): container finished" podID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" containerID="619b2be9e8a3f61b134d163bc3ebb4105259f3d6eadad7ea8f76de2333bbeac4" exitCode=0
Mar 11 12:18:48 crc kubenswrapper[4816]: I0311 12:18:48.070632 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w8rqc" event={"ID":"09ce1ef6-fcd0-4182-afca-22c5892b48e2","Type":"ContainerDied","Data":"619b2be9e8a3f61b134d163bc3ebb4105259f3d6eadad7ea8f76de2333bbeac4"}
Mar 11 12:18:48 crc kubenswrapper[4816]: I0311 12:18:48.081547 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"]
Mar 11 12:18:48 crc kubenswrapper[4816]: W0311 12:18:48.422005 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68498f16_b5c3_4960_8565_7ae628fc3122.slice/crio-ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09 WatchSource:0}: Error finding container ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09: Status 404 returned error can't find the container with id ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09
Mar 11 12:18:48 crc kubenswrapper[4816]: I0311 12:18:48.426970 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"]
Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.113123 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerStarted","Data":"385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7"}
Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.113635 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerStarted","Data":"ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09"}
Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.119464 4816 generic.go:334] "Generic (PLEG): container finished" podID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerID="f365943c3bcd25f6e7decbae194b1841bae20d3a5ca1816848e3f40bb39b1c41" exitCode=0
Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.121924 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerDied","Data":"f365943c3bcd25f6e7decbae194b1841bae20d3a5ca1816848e3f40bb39b1c41"}
Mar 11 12:18:49 crc kubenswrapper[4816]: I0311 12:18:49.121976 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerStarted","Data":"88fbfecb363955a5809ac97d8f060b762763c43823be00b756985a39bffcbe7e"}
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.161086 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-64584d7649-mb6k8"]
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.163241 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.170932 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.175329 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.178046 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"]
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.195948 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.196189 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.235122 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.255342 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282561 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282631 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282686 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282738 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282764 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282804 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.282838 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.384866 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.384987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385030 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385111 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385206 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385238 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.385311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.393436 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.394212 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.394450 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.395064 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.396054 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.405967 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.409177 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") pod \"neutron-64584d7649-mb6k8\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:50 crc kubenswrapper[4816]: I0311 12:18:50.488898 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64584d7649-mb6k8"
Mar 11 12:18:51 crc kubenswrapper[4816]: I0311 12:18:51.156425 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:51 crc kubenswrapper[4816]: I0311 12:18:51.157094 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.137651 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w8rqc"
Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259228 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") "
Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259718 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") "
Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259777 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") "
Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.259962 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.260029 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") pod \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\" (UID: \"09ce1ef6-fcd0-4182-afca-22c5892b48e2\") " Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.265003 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w8rqc" event={"ID":"09ce1ef6-fcd0-4182-afca-22c5892b48e2","Type":"ContainerDied","Data":"fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5"} Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.265069 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb98e4336d682d875b7e94b2c10df2c35624f0d26db0436c3d5f3cac73ca09c5" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.265176 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w8rqc" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.278228 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts" (OuterVolumeSpecName: "scripts") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.281533 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.282568 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q" (OuterVolumeSpecName: "kube-api-access-bdq6q") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "kube-api-access-bdq6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.294111 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.358421 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data" (OuterVolumeSpecName: "config-data") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363112 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdq6q\" (UniqueName: \"kubernetes.io/projected/09ce1ef6-fcd0-4182-afca-22c5892b48e2-kube-api-access-bdq6q\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363149 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363166 4816 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363178 4816 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.363191 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.370335 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09ce1ef6-fcd0-4182-afca-22c5892b48e2" (UID: "09ce1ef6-fcd0-4182-afca-22c5892b48e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.465552 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ce1ef6-fcd0-4182-afca-22c5892b48e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.614704 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.614799 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.649385 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.671203 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.702192 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.702353 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.716449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 12:18:53 crc kubenswrapper[4816]: W0311 12:18:53.759935 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd930e1b_a508_4a64_8825_9800b8010d59.slice/crio-0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f WatchSource:0}: Error finding container 0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f: Status 404 
returned error can't find the container with id 0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f Mar 11 12:18:53 crc kubenswrapper[4816]: I0311 12:18:53.782473 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.283733 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerStarted","Data":"06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.284172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerStarted","Data":"0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.310764 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerStarted","Data":"3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.311162 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.313668 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:18:54 crc kubenswrapper[4816]: E0311 12:18:54.316036 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" containerName="keystone-bootstrap" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.316069 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" containerName="keystone-bootstrap" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 
12:18:54.316318 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" containerName="keystone-bootstrap" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.317269 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.323922 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerStarted","Data":"ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324155 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324375 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324496 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324512 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324381 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.324772 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.325034 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8k5jj" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.348056 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 
12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.408563 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4b4ms" event={"ID":"f92c8acc-1a4a-4f28-a123-2f5b8b6905af","Type":"ContainerStarted","Data":"1f2178fe24813df8bfcc542c32d18ec7c0d7ab550dc406623e692f0465cd6535"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412058 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412161 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412223 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412357 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412625 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.412785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.413175 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.413211 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.434745 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0"} Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.437064 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 
12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.437106 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.478609 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9df8757bb-rzb52" podStartSLOduration=7.478574828 podStartE2EDuration="7.478574828s" podCreationTimestamp="2026-03-11 12:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:54.338752152 +0000 UTC m=+1220.930016119" watchObservedRunningTime="2026-03-11 12:18:54.478574828 +0000 UTC m=+1221.069838795" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518617 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518828 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518880 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518908 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518930 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.518961 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.519023 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.519139 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.535210 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.542902 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.543222 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.544917 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.552817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.566290 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: 
\"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.577395 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" podStartSLOduration=7.5773635729999995 podStartE2EDuration="7.577363573s" podCreationTimestamp="2026-03-11 12:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:54.477871548 +0000 UTC m=+1221.069135505" watchObservedRunningTime="2026-03-11 12:18:54.577363573 +0000 UTC m=+1221.168627540" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.592974 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.594388 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") pod \"keystone-5d6ddcd789-qjf9c\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.643916 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4b4ms" podStartSLOduration=2.4001057230000002 podStartE2EDuration="41.643891649s" podCreationTimestamp="2026-03-11 12:18:13 +0000 UTC" firstStartedPulling="2026-03-11 12:18:14.468391173 +0000 UTC m=+1181.059655140" lastFinishedPulling="2026-03-11 12:18:53.712177089 +0000 UTC m=+1220.303441066" observedRunningTime="2026-03-11 12:18:54.57169064 +0000 UTC m=+1221.162954607" watchObservedRunningTime="2026-03-11 
12:18:54.643891649 +0000 UTC m=+1221.235155616" Mar 11 12:18:54 crc kubenswrapper[4816]: I0311 12:18:54.751695 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.402566 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.461371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerStarted","Data":"ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d"} Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.461770 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.465092 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d6ddcd789-qjf9c" event={"ID":"9c180505-72c6-498d-bfa5-05f689692bd2","Type":"ContainerStarted","Data":"210d5da4467eeb407cc3db147ba87bbb3dfcf68d3ca56b768383a1d9ec2cdc8a"} Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.475534 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjmnw" event={"ID":"2772ef82-fe14-4f4d-8349-8ee515e39979","Type":"ContainerStarted","Data":"c3956854978860cbc650270e665106bd8e95400d5b8cce00a86ed500eb262922"} Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.501712 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-64584d7649-mb6k8" podStartSLOduration=5.501690482 podStartE2EDuration="5.501690482s" podCreationTimestamp="2026-03-11 12:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:55.492553319 +0000 UTC m=+1222.083817286" 
watchObservedRunningTime="2026-03-11 12:18:55.501690482 +0000 UTC m=+1222.092954449"
Mar 11 12:18:55 crc kubenswrapper[4816]: I0311 12:18:55.552470 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fjmnw" podStartSLOduration=4.303717687 podStartE2EDuration="43.552431964s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="2026-03-11 12:18:14.461362031 +0000 UTC m=+1181.052625998" lastFinishedPulling="2026-03-11 12:18:53.710076308 +0000 UTC m=+1220.301340275" observedRunningTime="2026-03-11 12:18:55.513380349 +0000 UTC m=+1222.104644326" watchObservedRunningTime="2026-03-11 12:18:55.552431964 +0000 UTC m=+1222.143695931"
Mar 11 12:18:56 crc kubenswrapper[4816]: I0311 12:18:56.488811 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d6ddcd789-qjf9c" event={"ID":"9c180505-72c6-498d-bfa5-05f689692bd2","Type":"ContainerStarted","Data":"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8"}
Mar 11 12:18:56 crc kubenswrapper[4816]: I0311 12:18:56.489316 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5d6ddcd789-qjf9c"
Mar 11 12:18:56 crc kubenswrapper[4816]: I0311 12:18:56.488846 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 11 12:18:56 crc kubenswrapper[4816]: I0311 12:18:56.489356 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.397981 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.402272 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.421195 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5d6ddcd789-qjf9c" podStartSLOduration=3.42117205 podStartE2EDuration="3.42117205s" podCreationTimestamp="2026-03-11 12:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:18:56.515419336 +0000 UTC m=+1223.106683303" watchObservedRunningTime="2026-03-11 12:18:57.42117205 +0000 UTC m=+1224.012436017"
Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.508644 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rjxsf" event={"ID":"c643aa04-ce8d-4c3b-befc-ecdf63e35de8","Type":"ContainerStarted","Data":"c704df83b8c052d797cc33017726ade79e749840ea39268bdd3404b42194d40d"}
Mar 11 12:18:57 crc kubenswrapper[4816]: I0311 12:18:57.523394 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rjxsf" podStartSLOduration=3.688778427 podStartE2EDuration="45.523370793s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="2026-03-11 12:18:14.892076393 +0000 UTC m=+1181.483340360" lastFinishedPulling="2026-03-11 12:18:56.726668759 +0000 UTC m=+1223.317932726" observedRunningTime="2026-03-11 12:18:57.521733526 +0000 UTC m=+1224.112997493" watchObservedRunningTime="2026-03-11 12:18:57.523370793 +0000 UTC m=+1224.114634760"
Mar 11 12:18:58 crc kubenswrapper[4816]: I0311 12:18:58.524349 4816 generic.go:334] "Generic (PLEG): container finished" podID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" containerID="1f2178fe24813df8bfcc542c32d18ec7c0d7ab550dc406623e692f0465cd6535" exitCode=0
Mar 11 12:18:58 crc kubenswrapper[4816]: I0311 12:18:58.524458 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4b4ms" event={"ID":"f92c8acc-1a4a-4f28-a123-2f5b8b6905af","Type":"ContainerDied","Data":"1f2178fe24813df8bfcc542c32d18ec7c0d7ab550dc406623e692f0465cd6535"}
Mar 11 12:19:01 crc kubenswrapper[4816]: I0311 12:19:01.554968 4816 generic.go:334] "Generic (PLEG): container finished" podID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" containerID="c704df83b8c052d797cc33017726ade79e749840ea39268bdd3404b42194d40d" exitCode=0
Mar 11 12:19:01 crc kubenswrapper[4816]: I0311 12:19:01.555094 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rjxsf" event={"ID":"c643aa04-ce8d-4c3b-befc-ecdf63e35de8","Type":"ContainerDied","Data":"c704df83b8c052d797cc33017726ade79e749840ea39268bdd3404b42194d40d"}
Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.523428 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.601654 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"]
Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.602376 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" containerID="cri-o://ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c" gracePeriod=10
Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.605558 4816 generic.go:334] "Generic (PLEG): container finished" podID="2772ef82-fe14-4f4d-8349-8ee515e39979" containerID="c3956854978860cbc650270e665106bd8e95400d5b8cce00a86ed500eb262922" exitCode=0
Mar 11 12:19:02 crc kubenswrapper[4816]: I0311 12:19:02.605743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjmnw" event={"ID":"2772ef82-fe14-4f4d-8349-8ee515e39979","Type":"ContainerDied","Data":"c3956854978860cbc650270e665106bd8e95400d5b8cce00a86ed500eb262922"}
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.062344 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rjxsf"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.069326 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4b4ms"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248011 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") pod \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") "
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248116 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") "
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248230 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") pod \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") "
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248276 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") "
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") pod \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\" (UID: \"c643aa04-ce8d-4c3b-befc-ecdf63e35de8\") "
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248410 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") "
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248431 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") "
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.248474 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") pod \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\" (UID: \"f92c8acc-1a4a-4f28-a123-2f5b8b6905af\") "
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.262950 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67" (OuterVolumeSpecName: "kube-api-access-5zq67") pod "c643aa04-ce8d-4c3b-befc-ecdf63e35de8" (UID: "c643aa04-ce8d-4c3b-befc-ecdf63e35de8"). InnerVolumeSpecName "kube-api-access-5zq67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.263073 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c643aa04-ce8d-4c3b-befc-ecdf63e35de8" (UID: "c643aa04-ce8d-4c3b-befc-ecdf63e35de8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.264445 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc" (OuterVolumeSpecName: "kube-api-access-m6lqc") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "kube-api-access-m6lqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.264700 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs" (OuterVolumeSpecName: "logs") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.288121 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts" (OuterVolumeSpecName: "scripts") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350556 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zq67\" (UniqueName: \"kubernetes.io/projected/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-kube-api-access-5zq67\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350616 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350630 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6lqc\" (UniqueName: \"kubernetes.io/projected/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-kube-api-access-m6lqc\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350642 4816 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.350654 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.374422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data" (OuterVolumeSpecName: "config-data") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.389531 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c643aa04-ce8d-4c3b-befc-ecdf63e35de8" (UID: "c643aa04-ce8d-4c3b-befc-ecdf63e35de8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.404513 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f92c8acc-1a4a-4f28-a123-2f5b8b6905af" (UID: "f92c8acc-1a4a-4f28-a123-2f5b8b6905af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.454458 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c643aa04-ce8d-4c3b-befc-ecdf63e35de8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.454498 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.454509 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92c8acc-1a4a-4f28-a123-2f5b8b6905af-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.621703 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rjxsf" event={"ID":"c643aa04-ce8d-4c3b-befc-ecdf63e35de8","Type":"ContainerDied","Data":"1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2"}
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.621749 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e6a18d4f0b251cb2f7727ad5be471c642eca99dd05bbcec781288abe852fcc2"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.621823 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rjxsf"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.626632 4816 generic.go:334] "Generic (PLEG): container finished" podID="ad047cd1-309a-401e-9fc6-cb1349614136" containerID="ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c" exitCode=0
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.626732 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerDied","Data":"ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c"}
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.628753 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4b4ms" event={"ID":"f92c8acc-1a4a-4f28-a123-2f5b8b6905af","Type":"ContainerDied","Data":"31e496272578b057f389702c22da6db4b04713d9b39444d9f2071398a63be537"}
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.628795 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4b4ms"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.628798 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31e496272578b057f389702c22da6db4b04713d9b39444d9f2071398a63be537"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.801163 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"]
Mar 11 12:19:03 crc kubenswrapper[4816]: E0311 12:19:03.802072 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" containerName="placement-db-sync"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.802089 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" containerName="placement-db-sync"
Mar 11 12:19:03 crc kubenswrapper[4816]: E0311 12:19:03.802121 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" containerName="barbican-db-sync"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.802127 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" containerName="barbican-db-sync"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.802333 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" containerName="barbican-db-sync"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.802351 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" containerName="placement-db-sync"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.803382 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.806857 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.807216 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.808038 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fxmtd"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.811899 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"]
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.813677 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.821605 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.833761 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"]
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865609 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865649 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865689 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.865812 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.873944 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"]
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.963053 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"]
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.965060 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972828 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972865 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972900 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972922 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.972968 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973004 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973041 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973057 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973089 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973112 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973134 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973154 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973182 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973204 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.973532 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.988590 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.989147 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.993576 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:03 crc kubenswrapper[4816]: I0311 12:19:03.997572 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"]
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:03.998543 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") pod \"barbican-keystone-listener-59b4f4d478-5b797\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077650 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077747 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077786 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077832 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077905 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077943 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077972 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.077998 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.078036 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.078064 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.079150 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.079793 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.082691 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.085162 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.092742 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.095401 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.099036 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.099229 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.103600 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.113659 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") pod \"dnsmasq-dns-7fc46d7df7-2frp2\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") " pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.117855 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") pod \"barbican-worker-855897fd55-t7sfb\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.141101 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-855897fd55-t7sfb"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.150457 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.172322 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.211442 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"]
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.219672 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d4754df76-xnl78"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.222523 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.237003 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"]
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.309883 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"]
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.312280 4816 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316194 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l2nzr" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316521 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316671 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316814 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.316928 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.322998 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"] Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.364066 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.368097 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.393570 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.393672 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.393748 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.393948 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.394025 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") pod 
\"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496027 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496098 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496221 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496304 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496363 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496442 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496492 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496546 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496729 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") pod \"ad047cd1-309a-401e-9fc6-cb1349614136\" (UID: \"ad047cd1-309a-401e-9fc6-cb1349614136\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496797 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: 
\"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.496844 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") pod \"2772ef82-fe14-4f4d-8349-8ee515e39979\" (UID: \"2772ef82-fe14-4f4d-8349-8ee515e39979\") " Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498313 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498394 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498425 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498524 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc 
kubenswrapper[4816]: I0311 12:19:04.498610 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498651 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498713 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498832 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.498885 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc 
kubenswrapper[4816]: I0311 12:19:04.498946 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.499003 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.499057 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.501356 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.502814 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.506756 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts" (OuterVolumeSpecName: "scripts") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.507873 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp" (OuterVolumeSpecName: "kube-api-access-5n8zp") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "kube-api-access-5n8zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.508352 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.512848 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc" (OuterVolumeSpecName: "kube-api-access-qp8jc") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "kube-api-access-qp8jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.515727 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.518798 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.528690 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") pod \"barbican-api-5d4754df76-xnl78\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.529957 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") pod \"barbican-api-5d4754df76-xnl78\" (UID: 
\"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.549886 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.558402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.597035 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.597240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600571 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600614 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600651 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600705 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600734 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 
12:19:04.600773 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600806 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600859 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600869 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600879 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2772ef82-fe14-4f4d-8349-8ee515e39979-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600891 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp8jc\" (UniqueName: \"kubernetes.io/projected/ad047cd1-309a-401e-9fc6-cb1349614136-kube-api-access-qp8jc\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600900 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600910 4816 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600919 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.600927 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n8zp\" (UniqueName: \"kubernetes.io/projected/2772ef82-fe14-4f4d-8349-8ee515e39979-kube-api-access-5n8zp\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.607319 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.625884 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.631359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: 
\"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.632811 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.634797 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.635710 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.644825 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") pod \"placement-5ffd6fb588-7hftz\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.680177 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" event={"ID":"ad047cd1-309a-401e-9fc6-cb1349614136","Type":"ContainerDied","Data":"7732a86e8dc12bafbe8cdaac586dd615d3b76e080ef096246aeda54dd0e49383"} Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.681052 4816 scope.go:117] 
"RemoveContainer" containerID="ccafb95fbf3f12326123ae581a70f3b9eefd2d320c697240864a31290ea2a66c" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.681488 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.689961 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjmnw" event={"ID":"2772ef82-fe14-4f4d-8349-8ee515e39979","Type":"ContainerDied","Data":"f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422"} Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.690002 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f11050b66cf18643ca807dd8a6fddbe1c30160c5ebaa861b516a6a0d311fa422" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.690054 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjmnw" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.736826 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config" (OuterVolumeSpecName: "config") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.736956 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.737155 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad047cd1-309a-401e-9fc6-cb1349614136" (UID: "ad047cd1-309a-401e-9fc6-cb1349614136"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.743374 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data" (OuterVolumeSpecName: "config-data") pod "2772ef82-fe14-4f4d-8349-8ee515e39979" (UID: "2772ef82-fe14-4f4d-8349-8ee515e39979"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.805220 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.805636 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.805647 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2772ef82-fe14-4f4d-8349-8ee515e39979-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.805657 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad047cd1-309a-401e-9fc6-cb1349614136-config\") on node \"crc\" DevicePath \"\"" 
Mar 11 12:19:04 crc kubenswrapper[4816]: I0311 12:19:04.932835 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.025861 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:05 crc kubenswrapper[4816]: E0311 12:19:05.026477 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026502 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" Mar 11 12:19:05 crc kubenswrapper[4816]: E0311 12:19:05.026552 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" containerName="cinder-db-sync" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026563 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" containerName="cinder-db-sync" Mar 11 12:19:05 crc kubenswrapper[4816]: E0311 12:19:05.026578 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="init" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026587 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="init" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026836 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" containerName="cinder-db-sync" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.026886 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.028369 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.037123 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.037532 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.037565 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6qw4t" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.037871 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.067693 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.104601 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119187 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119240 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119321 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119346 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119369 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119407 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.119504 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.135470 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-2nfvt"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.147231 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.148935 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.166338 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.192641 4816 scope.go:117] "RemoveContainer" containerID="f842cab6fbb753d4036f93abfc735f41fd91ab93fdcba8e10b330c30d7aa8346" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.201682 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221491 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221552 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221634 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221657 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221680 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.221719 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.222106 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.232394 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.233196 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.237753 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.238296 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.239467 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.249053 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") pod \"cinder-scheduler-0\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.249186 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:19:05 crc kubenswrapper[4816]: W0311 12:19:05.254449 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd535a1_7585_4cb7_94ec_f4b98b10be4a.slice/crio-84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28 WatchSource:0}: Error finding container 84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28: Status 404 returned error can't find the container with id 84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28 Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.326503 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337025 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337385 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337431 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337467 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.337499 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.379833 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.380337 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.393176 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.399933 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.400516 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443437 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443511 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwzct\" (UniqueName: \"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443564 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443596 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443624 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443675 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443705 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443720 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443736 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443774 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443792 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.443811 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.444866 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.445545 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.446581 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.447351 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.450629 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: 
\"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.467701 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") pod \"dnsmasq-dns-58b85ccffc-7gcck\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") " pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546290 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546960 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.547002 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vwzct\" (UniqueName: \"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.547092 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.547156 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.546475 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.548515 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.555336 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc 
kubenswrapper[4816]: I0311 12:19:05.559784 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.562854 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.563436 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.573841 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwzct\" (UniqueName: \"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") pod \"cinder-api-0\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.614514 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.732115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" event={"ID":"3bd40d51-3ead-4137-9b14-2a93f44f4166","Type":"ContainerStarted","Data":"e340351756a9ee01fd1961ba595c2cad8bbf26c5f081172dd9ef510e6ebc5cd5"} Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.737916 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.758807 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerStarted","Data":"65e8dd7e6335c0228a44e94f23c28e5cede1dd965bd20e6b4cf61bc69bb5386a"} Mar 11 12:19:05 crc kubenswrapper[4816]: I0311 12:19:05.760080 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerStarted","Data":"84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28"} Mar 11 12:19:05 crc kubenswrapper[4816]: W0311 12:19:05.984747 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba61db44_272d_4f1c_b3c6_d3fe1edb38bd.slice/crio-9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f WatchSource:0}: Error finding container 9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f: Status 404 returned error can't find the container with id 9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.020845 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.167585 4816 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" path="/var/lib/kubelet/pods/ad047cd1-309a-401e-9fc6-cb1349614136/volumes" Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.169129 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"] Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.227793 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:06 crc kubenswrapper[4816]: W0311 12:19:06.234969 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8f76a92_4234_474b_bca2_f5d9cbbec8f2.slice/crio-965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf WatchSource:0}: Error finding container 965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf: Status 404 returned error can't find the container with id 965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.678847 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.711523 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"] Mar 11 12:19:06 crc kubenswrapper[4816]: W0311 12:19:06.756266 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f7f295b_c30d_49a7_b5fa_b1ae8f705589.slice/crio-22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec WatchSource:0}: Error finding container 22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec: Status 404 returned error can't find the container with id 22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.799547 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerStarted","Data":"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a"} Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.799818 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-central-agent" containerID="cri-o://2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" gracePeriod=30 Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.800339 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.800713 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-notification-agent" containerID="cri-o://785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" gracePeriod=30 Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.800762 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="sg-core" containerID="cri-o://fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" gracePeriod=30 Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.800812 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="proxy-httpd" containerID="cri-o://56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" gracePeriod=30 Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.816636 4816 generic.go:334] "Generic (PLEG): container finished" podID="3bd40d51-3ead-4137-9b14-2a93f44f4166" containerID="976f0996d7f32cec3b2ed81142b0919faa0b23eb7e4ba00fa314a16f1166512f" exitCode=0 
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.816861 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" event={"ID":"3bd40d51-3ead-4137-9b14-2a93f44f4166","Type":"ContainerDied","Data":"976f0996d7f32cec3b2ed81142b0919faa0b23eb7e4ba00fa314a16f1166512f"}
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.826326 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.272447595 podStartE2EDuration="54.826308036s" podCreationTimestamp="2026-03-11 12:18:12 +0000 UTC" firstStartedPulling="2026-03-11 12:18:14.841785745 +0000 UTC m=+1181.433049712" lastFinishedPulling="2026-03-11 12:19:05.395646186 +0000 UTC m=+1231.986910153" observedRunningTime="2026-03-11 12:19:06.821695334 +0000 UTC m=+1233.412959301" watchObservedRunningTime="2026-03-11 12:19:06.826308036 +0000 UTC m=+1233.417572003"
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.837197 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerStarted","Data":"5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf"}
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.837273 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerStarted","Data":"9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f"}
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.869136 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerStarted","Data":"22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec"}
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.876026 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerStarted","Data":"d1d101cb43433bc7eb7c833f258e91530ee7e5c09a0712cf4851d690643adb2a"}
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.882515 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerStarted","Data":"584cd4107522305bdba692719070a92eec3324ee2da427663b64c0c877cbea0c"}
Mar 11 12:19:06 crc kubenswrapper[4816]: I0311 12:19:06.887961 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerStarted","Data":"965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf"}
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.620269 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.705684 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") "
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.705744 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") "
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.705868 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") "
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.705928 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") "
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.706038 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") "
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.706074 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") pod \"3bd40d51-3ead-4137-9b14-2a93f44f4166\" (UID: \"3bd40d51-3ead-4137-9b14-2a93f44f4166\") "
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.718128 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx" (OuterVolumeSpecName: "kube-api-access-ltmfx") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "kube-api-access-ltmfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.809031 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.821133 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltmfx\" (UniqueName: \"kubernetes.io/projected/3bd40d51-3ead-4137-9b14-2a93f44f4166-kube-api-access-ltmfx\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.835695 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.881822 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.884944 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.910764 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerID="56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" exitCode=0
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.910993 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerID="fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" exitCode=2
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.911005 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerID="2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" exitCode=0
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.910962 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a"}
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.911064 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0"}
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.911074 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf"}
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.913847 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2" event={"ID":"3bd40d51-3ead-4137-9b14-2a93f44f4166","Type":"ContainerDied","Data":"e340351756a9ee01fd1961ba595c2cad8bbf26c5f081172dd9ef510e6ebc5cd5"}
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.913889 4816 scope.go:117] "RemoveContainer" containerID="976f0996d7f32cec3b2ed81142b0919faa0b23eb7e4ba00fa314a16f1166512f"
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.914049 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-2frp2"
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.915884 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config" (OuterVolumeSpecName: "config") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.917363 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerStarted","Data":"eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d"}
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.918371 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d4754df76-xnl78"
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.918399 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d4754df76-xnl78"
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.923018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerStarted","Data":"3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f"}
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.923602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3bd40d51-3ead-4137-9b14-2a93f44f4166" (UID: "3bd40d51-3ead-4137-9b14-2a93f44f4166"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928332 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928370 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928387 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928399 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.928409 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bd40d51-3ead-4137-9b14-2a93f44f4166-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:07 crc kubenswrapper[4816]: I0311 12:19:07.950690 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d4754df76-xnl78" podStartSLOduration=3.950660793 podStartE2EDuration="3.950660793s" podCreationTimestamp="2026-03-11 12:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:07.940888024 +0000 UTC m=+1234.532151991" watchObservedRunningTime="2026-03-11 12:19:07.950660793 +0000 UTC m=+1234.541924760"
Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.287551 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"]
Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.287819 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-2frp2"]
Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.863408 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-759cc7f497-2nfvt" podUID="ad047cd1-309a-401e-9fc6-cb1349614136" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: i/o timeout"
Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.939301 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerID="93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603" exitCode=0
Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.939366 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerDied","Data":"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603"}
Mar 11 12:19:08 crc kubenswrapper[4816]: I0311 12:19:08.942414 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerStarted","Data":"346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd"}
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.514926 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator:
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.515461 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.515516 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82"
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.516478 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.516529 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96" gracePeriod=600
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.959678 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerStarted","Data":"6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b"}
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.961335 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ffd6fb588-7hftz"
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.961357 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ffd6fb588-7hftz"
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.963420 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerStarted","Data":"cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621"}
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.967411 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerStarted","Data":"4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb"}
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.967450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerStarted","Data":"f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2"}
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.970988 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96" exitCode=0
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.971458 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96"}
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.971496 4816 scope.go:117] "RemoveContainer" containerID="13e7eed3f44dcb7bba59d21f6a1bb4bc9f4b869b7a25106a79ff8ceef1b9e507"
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.983484 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5ffd6fb588-7hftz" podStartSLOduration=5.983466058 podStartE2EDuration="5.983466058s" podCreationTimestamp="2026-03-11 12:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:09.982970864 +0000 UTC m=+1236.574234841" watchObservedRunningTime="2026-03-11 12:19:09.983466058 +0000 UTC m=+1236.574730015"
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.989800 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerStarted","Data":"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207"}
Mar 11 12:19:09 crc kubenswrapper[4816]: I0311 12:19:09.989839 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerStarted","Data":"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85"}
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.000354 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerStarted","Data":"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"}
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.002491 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.022372 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api-log" containerID="cri-o://346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd" gracePeriod=30
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.022656 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerStarted","Data":"0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193"}
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.023163 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.022704 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api" containerID="cri-o://0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193" gracePeriod=30
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.044568 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-855897fd55-t7sfb" podStartSLOduration=3.327157953 podStartE2EDuration="7.044548663s" podCreationTimestamp="2026-03-11 12:19:03 +0000 UTC" firstStartedPulling="2026-03-11 12:19:05.305465149 +0000 UTC m=+1231.896729116" lastFinishedPulling="2026-03-11 12:19:09.022855839 +0000 UTC m=+1235.614119826" observedRunningTime="2026-03-11 12:19:10.020457535 +0000 UTC m=+1236.611721502" watchObservedRunningTime="2026-03-11 12:19:10.044548663 +0000 UTC m=+1236.635812630"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.067041 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" podStartSLOduration=3.354487383 podStartE2EDuration="7.066992004s" podCreationTimestamp="2026-03-11 12:19:03 +0000 UTC" firstStartedPulling="2026-03-11 12:19:05.310168493 +0000 UTC m=+1231.901432460" lastFinishedPulling="2026-03-11 12:19:09.022673114 +0000 UTC m=+1235.613937081" observedRunningTime="2026-03-11 12:19:10.042184986 +0000 UTC m=+1236.633448953" watchObservedRunningTime="2026-03-11 12:19:10.066992004 +0000 UTC m=+1236.658255961"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.166975 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd40d51-3ead-4137-9b14-2a93f44f4166" path="/var/lib/kubelet/pods/3bd40d51-3ead-4137-9b14-2a93f44f4166/volumes"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.171767 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.171741748 podStartE2EDuration="5.171741748s" podCreationTimestamp="2026-03-11 12:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:10.143806169 +0000 UTC m=+1236.735070146" watchObservedRunningTime="2026-03-11 12:19:10.171741748 +0000 UTC m=+1236.763005715"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.175361 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" podStartSLOduration=5.17534163 podStartE2EDuration="5.17534163s" podCreationTimestamp="2026-03-11 12:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:10.101699816 +0000 UTC m=+1236.692963803" watchObservedRunningTime="2026-03-11 12:19:10.17534163 +0000 UTC m=+1236.766605597"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.375360 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"]
Mar 11 12:19:10 crc kubenswrapper[4816]: E0311 12:19:10.376622 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd40d51-3ead-4137-9b14-2a93f44f4166" containerName="init"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.376781 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd40d51-3ead-4137-9b14-2a93f44f4166" containerName="init"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.377271 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd40d51-3ead-4137-9b14-2a93f44f4166" containerName="init"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.378767 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.385970 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.386558 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.423015 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"]
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.543848 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.543906 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544050 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544128 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544153 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544202 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.544222 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646066 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646126 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646226 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646309 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646330 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646376 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.646393 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.647033 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.659815 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.660358 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.660907 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.670170 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.685545 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.693836 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") pod \"barbican-api-64b59f8d4-2vxd9\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.743689 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.763004 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852501 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") "
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852555 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") "
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852609 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") "
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") "
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") "
Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.852853 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") pod \"1ebe3f2a-5719-412c-8803-15e1bec74523\" (UID: \"1ebe3f2a-5719-412c-8803-15e1bec74523\") " Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.853742 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.854520 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.855193 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.865468 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx" (OuterVolumeSpecName: "kube-api-access-wnpcx") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "kube-api-access-wnpcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.865946 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts" (OuterVolumeSpecName: "scripts") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.927984 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.956381 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnpcx\" (UniqueName: \"kubernetes.io/projected/1ebe3f2a-5719-412c-8803-15e1bec74523-kube-api-access-wnpcx\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.956429 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.956441 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.956457 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ebe3f2a-5719-412c-8803-15e1bec74523-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.960074 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:10 crc kubenswrapper[4816]: I0311 12:19:10.986417 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data" (OuterVolumeSpecName: "config-data") pod "1ebe3f2a-5719-412c-8803-15e1bec74523" (UID: "1ebe3f2a-5719-412c-8803-15e1bec74523"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.032909 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a"} Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.036025 4816 generic.go:334] "Generic (PLEG): container finished" podID="43eac2c3-bace-4682-b48e-f063d6653733" containerID="346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd" exitCode=143 Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.036099 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerDied","Data":"346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd"} Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.043623 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerID="785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" exitCode=0 Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.044370 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde"} Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.044421 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ebe3f2a-5719-412c-8803-15e1bec74523","Type":"ContainerDied","Data":"8ef644ece8e49d46c6b3d18fe5c5f96913f607e9b6a202c08e5f7ee442c27c93"} Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.044446 4816 scope.go:117] "RemoveContainer" containerID="56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" Mar 11 12:19:11 crc 
kubenswrapper[4816]: I0311 12:19:11.044674 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.059081 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.059126 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebe3f2a-5719-412c-8803-15e1bec74523-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.145491 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.158845 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171066 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:11 crc kubenswrapper[4816]: E0311 12:19:11.171506 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-central-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171526 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-central-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: E0311 12:19:11.171539 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-notification-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171546 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-notification-agent" Mar 11 12:19:11 crc 
kubenswrapper[4816]: E0311 12:19:11.171564 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="sg-core" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171572 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="sg-core" Mar 11 12:19:11 crc kubenswrapper[4816]: E0311 12:19:11.171588 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="proxy-httpd" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171594 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="proxy-httpd" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171762 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-central-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171794 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="ceilometer-notification-agent" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171812 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="sg-core" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.171823 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" containerName="proxy-httpd" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.173452 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.176375 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.177270 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.188672 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.265785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.265893 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.265940 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.265983 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " 
pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.266011 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.266066 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.266229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367737 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367828 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367852 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367871 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367903 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.367986 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.368535 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: 
I0311 12:19:11.369260 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.375714 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.376788 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.377049 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.377869 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") pod \"ceilometer-0\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.404963 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") pod \"ceilometer-0\" (UID: 
\"10e3f184-9109-4af7-8ca6-822379e0c513\") " pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.516798 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:11 crc kubenswrapper[4816]: I0311 12:19:11.895117 4816 scope.go:117] "RemoveContainer" containerID="fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.025529 4816 scope.go:117] "RemoveContainer" containerID="785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.140843 4816 scope.go:117] "RemoveContainer" containerID="2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.187163 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ebe3f2a-5719-412c-8803-15e1bec74523" path="/var/lib/kubelet/pods/1ebe3f2a-5719-412c-8803-15e1bec74523/volumes" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.187915 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerStarted","Data":"e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778"} Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.234496 4816 scope.go:117] "RemoveContainer" containerID="56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" Mar 11 12:19:12 crc kubenswrapper[4816]: E0311 12:19:12.242755 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a\": container with ID starting with 56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a not found: ID does not exist" containerID="56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a" Mar 11 12:19:12 crc 
kubenswrapper[4816]: I0311 12:19:12.242813 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a"} err="failed to get container status \"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a\": rpc error: code = NotFound desc = could not find container \"56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a\": container with ID starting with 56bec0a6969c2a35ea2359b8b9e2a0d4a80229380dac2fa1a5de2e56cab22e4a not found: ID does not exist" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.242886 4816 scope.go:117] "RemoveContainer" containerID="fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" Mar 11 12:19:12 crc kubenswrapper[4816]: E0311 12:19:12.243183 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0\": container with ID starting with fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0 not found: ID does not exist" containerID="fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.243205 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0"} err="failed to get container status \"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0\": rpc error: code = NotFound desc = could not find container \"fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0\": container with ID starting with fd9f9269933cdf626acdae81b166112cef4742071274667ac737f1fb43d6eaa0 not found: ID does not exist" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.243225 4816 scope.go:117] "RemoveContainer" containerID="785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" Mar 11 
12:19:12 crc kubenswrapper[4816]: E0311 12:19:12.243422 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde\": container with ID starting with 785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde not found: ID does not exist" containerID="785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.243474 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde"} err="failed to get container status \"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde\": rpc error: code = NotFound desc = could not find container \"785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde\": container with ID starting with 785459bf5361b538fca731d5c9459763253d45826f0befba834e333e7e6a0dde not found: ID does not exist" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.243499 4816 scope.go:117] "RemoveContainer" containerID="2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" Mar 11 12:19:12 crc kubenswrapper[4816]: E0311 12:19:12.244822 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf\": container with ID starting with 2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf not found: ID does not exist" containerID="2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.244841 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf"} err="failed to get container status 
\"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf\": rpc error: code = NotFound desc = could not find container \"2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf\": container with ID starting with 2e2e079065db719aeee528343ea5a717b2f18beff4bcac65acd806fa3d456edf not found: ID does not exist" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.473692 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.760421674 podStartE2EDuration="8.473666712s" podCreationTimestamp="2026-03-11 12:19:04 +0000 UTC" firstStartedPulling="2026-03-11 12:19:06.241188346 +0000 UTC m=+1232.832452313" lastFinishedPulling="2026-03-11 12:19:08.954433384 +0000 UTC m=+1235.545697351" observedRunningTime="2026-03-11 12:19:12.197001767 +0000 UTC m=+1238.788265734" watchObservedRunningTime="2026-03-11 12:19:12.473666712 +0000 UTC m=+1239.064930679" Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.474580 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.636834 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:12 crc kubenswrapper[4816]: W0311 12:19:12.648760 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10e3f184_9109_4af7_8ca6_822379e0c513.slice/crio-6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7 WatchSource:0}: Error finding container 6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7: Status 404 returned error can't find the container with id 6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7 Mar 11 12:19:12 crc kubenswrapper[4816]: I0311 12:19:12.973569 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:13 crc 
kubenswrapper[4816]: I0311 12:19:13.169143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerStarted","Data":"5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2"}
Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.169197 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerStarted","Data":"8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240"}
Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.169212 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerStarted","Data":"9de7e47c0f14568909f59552b05e938af6254c4c9840ec07004683a8c3fa16e2"}
Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.169277 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.169310 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64b59f8d4-2vxd9"
Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.175456 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7"}
Mar 11 12:19:13 crc kubenswrapper[4816]: I0311 12:19:13.202157 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-64b59f8d4-2vxd9" podStartSLOduration=3.202115717 podStartE2EDuration="3.202115717s" podCreationTimestamp="2026-03-11 12:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:13.200325836 +0000 UTC m=+1239.791589833" watchObservedRunningTime="2026-03-11 12:19:13.202115717 +0000 UTC m=+1239.793379684"
Mar 11 12:19:14 crc kubenswrapper[4816]: I0311 12:19:14.195849 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf"}
Mar 11 12:19:14 crc kubenswrapper[4816]: I0311 12:19:14.197489 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831"}
Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.208498 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4"}
Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.382627 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.617449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck"
Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.729115 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"]
Mar 11 12:19:15 crc kubenswrapper[4816]: I0311 12:19:15.729372 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="dnsmasq-dns" containerID="cri-o://ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9" gracePeriod=10
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.092757 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.243498 4816 generic.go:334] "Generic (PLEG): container finished" podID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerID="ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9" exitCode=0
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.243563 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerDied","Data":"ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9"}
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.307792 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.563352 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708385 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") "
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708468 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") "
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708549 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") "
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708644 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") "
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708682 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") "
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.708740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") pod \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\" (UID: \"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c\") "
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.752707 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd" (OuterVolumeSpecName: "kube-api-access-mqnqd") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "kube-api-access-mqnqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.813173 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqnqd\" (UniqueName: \"kubernetes.io/projected/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-kube-api-access-mqnqd\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.824146 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.835857 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config" (OuterVolumeSpecName: "config") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.851988 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.883813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.896723 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" (UID: "b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.907312 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d4754df76-xnl78"
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914726 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914754 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914764 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914775 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:16 crc kubenswrapper[4816]: I0311 12:19:16.914786 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.067588 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d4754df76-xnl78"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.255143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerStarted","Data":"b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3"}
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.256334 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.265487 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.265895 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-2w84f" event={"ID":"b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c","Type":"ContainerDied","Data":"88fbfecb363955a5809ac97d8f060b762763c43823be00b756985a39bffcbe7e"}
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.265937 4816 scope.go:117] "RemoveContainer" containerID="ec1bad5250db6f7f6e52f756f22dc65681cf048e7705d228c0b2aebf5f68f5e9"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.266154 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="cinder-scheduler" containerID="cri-o://cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621" gracePeriod=30
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.266419 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="probe" containerID="cri-o://e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778" gracePeriod=30
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.310635 4816 scope.go:117] "RemoveContainer" containerID="f365943c3bcd25f6e7decbae194b1841bae20d3a5ca1816848e3f40bb39b1c41"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.320087 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.215760558 podStartE2EDuration="6.320061203s" podCreationTimestamp="2026-03-11 12:19:11 +0000 UTC" firstStartedPulling="2026-03-11 12:19:12.657058893 +0000 UTC m=+1239.248322860" lastFinishedPulling="2026-03-11 12:19:16.761359538 +0000 UTC m=+1243.352623505" observedRunningTime="2026-03-11 12:19:17.300606777 +0000 UTC m=+1243.891870744" watchObservedRunningTime="2026-03-11 12:19:17.320061203 +0000 UTC m=+1243.911325170"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.341153 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"]
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.372197 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-2w84f"]
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.661593 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9df8757bb-rzb52"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.933976 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"]
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.934421 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64584d7649-mb6k8" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-api" containerID="cri-o://06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658" gracePeriod=30
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.934496 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64584d7649-mb6k8" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" containerID="cri-o://ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d" gracePeriod=30
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.947493 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-64584d7649-mb6k8" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": EOF"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.992963 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"]
Mar 11 12:19:17 crc kubenswrapper[4816]: E0311 12:19:17.993834 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="init"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.993859 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="init"
Mar 11 12:19:17 crc kubenswrapper[4816]: E0311 12:19:17.993880 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="dnsmasq-dns"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.993890 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="dnsmasq-dns"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.994769 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" containerName="dnsmasq-dns"
Mar 11 12:19:17 crc kubenswrapper[4816]: I0311 12:19:17.996476 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.017003 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"]
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.141069 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c" path="/var/lib/kubelet/pods/b7c430b0-7c5a-4b11-8ec1-d551d9c91d2c/volumes"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144548 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144625 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144718 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7fr9\" (UniqueName: \"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144748 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144772 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144799 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.144830 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.246885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7fr9\" (UniqueName: \"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.246951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247012 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247047 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247295 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247410 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.247450 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.257377 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.257494 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.257596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.259181 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.259299 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.261416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.271214 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7fr9\" (UniqueName: \"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") pod \"neutron-6867c6dbc5-lzgfd\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.324476 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:18 crc kubenswrapper[4816]: I0311 12:19:18.710157 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.100929 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"]
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.303693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerStarted","Data":"10129169327e9c40582f9c635a8d87b021f99cc78ac017f7e4f16f40942456bc"}
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.306724 4816 generic.go:334] "Generic (PLEG): container finished" podID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerID="e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778" exitCode=0
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.306783 4816 generic.go:334] "Generic (PLEG): container finished" podID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerID="cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621" exitCode=0
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.306832 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerDied","Data":"e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778"}
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.306862 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerDied","Data":"cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621"}
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.309873 4816 generic.go:334] "Generic (PLEG): container finished" podID="bd930e1b-a508-4a64-8825-9800b8010d59" containerID="ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d" exitCode=0
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.309931 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerDied","Data":"ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d"}
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.759373 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785290 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") "
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785357 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") "
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785385 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") "
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785431 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") "
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785460 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") "
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.785488 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") pod \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\" (UID: \"a8f76a92-4234-474b-bca2-f5d9cbbec8f2\") "
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.791234 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts" (OuterVolumeSpecName: "scripts") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.791778 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.795831 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf" (OuterVolumeSpecName: "kube-api-access-z98gf") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "kube-api-access-z98gf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.802299 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.878558 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887402 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887452 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887463 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z98gf\" (UniqueName: \"kubernetes.io/projected/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-kube-api-access-z98gf\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887476 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.887485 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.924405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data" (OuterVolumeSpecName: "config-data") pod "a8f76a92-4234-474b-bca2-f5d9cbbec8f2" (UID: "a8f76a92-4234-474b-bca2-f5d9cbbec8f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:19 crc kubenswrapper[4816]: I0311 12:19:19.989049 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8f76a92-4234-474b-bca2-f5d9cbbec8f2-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.327724 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerStarted","Data":"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d"}
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.327850 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6867c6dbc5-lzgfd"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.327871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerStarted","Data":"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035"}
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.342550 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a8f76a92-4234-474b-bca2-f5d9cbbec8f2","Type":"ContainerDied","Data":"965649e801e9747f00b8263a5feeac3e050a872f0198fc9e14a40e62b55571cf"}
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.342646 4816 scope.go:117] "RemoveContainer" containerID="e6365a9b05307ab0d08f2be43b5fd09d02f68def8b266305137ad18cede98778"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.342648 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.363557 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6867c6dbc5-lzgfd" podStartSLOduration=3.363537587 podStartE2EDuration="3.363537587s" podCreationTimestamp="2026-03-11 12:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:20.358772151 +0000 UTC m=+1246.950036128" watchObservedRunningTime="2026-03-11 12:19:20.363537587 +0000 UTC m=+1246.954801544"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.364943 4816 generic.go:334] "Generic (PLEG): container finished" podID="bd930e1b-a508-4a64-8825-9800b8010d59" containerID="06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658" exitCode=0
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.365038 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerDied","Data":"06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658"}
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.418719 4816 scope.go:117] "RemoveContainer" containerID="cd4f48e197b603f62e0bf20ff6682b81e7b6286b2b2f453bd8e109136a0f4621"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.431939 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.439472 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.466712 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 11 12:19:20 crc kubenswrapper[4816]: E0311 12:19:20.467844 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="cinder-scheduler"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.467871 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="cinder-scheduler"
Mar 11 12:19:20 crc kubenswrapper[4816]: E0311 12:19:20.467910 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="probe"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.468419 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="probe"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.469143 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="probe"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.469203 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" containerName="cinder-scheduler"
Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.473032 4816 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.480691 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.491265 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610303 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610384 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610404 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610436 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.610474 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.628364 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711763 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711809 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711856 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") pod \"cinder-scheduler-0\" (UID: 
\"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711898 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711973 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.711997 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.713201 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.721894 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.725543 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.726657 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.732224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.732881 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") pod \"cinder-scheduler-0\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.812962 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813046 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") pod 
\"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813065 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813136 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813190 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.813322 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") pod \"bd930e1b-a508-4a64-8825-9800b8010d59\" (UID: \"bd930e1b-a508-4a64-8825-9800b8010d59\") " Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.823482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.843468 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x" (OuterVolumeSpecName: "kube-api-access-w6v7x") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "kube-api-access-w6v7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.892497 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.897981 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config" (OuterVolumeSpecName: "config") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.909533 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.910297 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916103 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916142 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916155 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6v7x\" (UniqueName: \"kubernetes.io/projected/bd930e1b-a508-4a64-8825-9800b8010d59-kube-api-access-w6v7x\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916168 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916181 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.916193 4816 reconciler_common.go:293] "Volume detached for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.917078 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:19:20 crc kubenswrapper[4816]: I0311 12:19:20.918726 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bd930e1b-a508-4a64-8825-9800b8010d59" (UID: "bd930e1b-a508-4a64-8825-9800b8010d59"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.018392 4816 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd930e1b-a508-4a64-8825-9800b8010d59-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.379385 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64584d7649-mb6k8" event={"ID":"bd930e1b-a508-4a64-8825-9800b8010d59","Type":"ContainerDied","Data":"0829aed87a841d8c87b4f741cc407293d8e591d9e8b4c02e21e8a61c30445d1f"} Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.379545 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64584d7649-mb6k8" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.379716 4816 scope.go:117] "RemoveContainer" containerID="ed06a5d04ea24da7b7022266f3b93adfbbc7a80293e5752545ee9f6add12458d" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.415154 4816 scope.go:117] "RemoveContainer" containerID="06ebd4a2da9305c5f9303396efc2a80f0ef4ae2462b9e8b47545883c85f3c658" Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.417347 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.429828 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-64584d7649-mb6k8"] Mar 11 12:19:21 crc kubenswrapper[4816]: I0311 12:19:21.502190 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.150628 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f76a92-4234-474b-bca2-f5d9cbbec8f2" path="/var/lib/kubelet/pods/a8f76a92-4234-474b-bca2-f5d9cbbec8f2/volumes" Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.152130 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" path="/var/lib/kubelet/pods/bd930e1b-a508-4a64-8825-9800b8010d59/volumes" Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.401613 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerStarted","Data":"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac"} Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.401666 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerStarted","Data":"883c96453eeb3dc398341c2c3b80a740484d91dd773b0fcfe0237a4112b6097a"} Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.608312 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.666730 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.743355 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.743623 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d4754df76-xnl78" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" containerID="cri-o://5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf" gracePeriod=30 Mar 11 12:19:22 crc kubenswrapper[4816]: I0311 12:19:22.744272 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d4754df76-xnl78" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" containerID="cri-o://eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d" gracePeriod=30 Mar 11 12:19:23 crc kubenswrapper[4816]: I0311 12:19:23.430942 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerStarted","Data":"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8"} Mar 11 12:19:23 crc kubenswrapper[4816]: I0311 12:19:23.442151 4816 generic.go:334] "Generic (PLEG): container finished" podID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerID="5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf" exitCode=143 Mar 11 12:19:23 crc 
kubenswrapper[4816]: I0311 12:19:23.442223 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerDied","Data":"5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf"} Mar 11 12:19:23 crc kubenswrapper[4816]: I0311 12:19:23.460202 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.460170239 podStartE2EDuration="3.460170239s" podCreationTimestamp="2026-03-11 12:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:23.447810946 +0000 UTC m=+1250.039074933" watchObservedRunningTime="2026-03-11 12:19:23.460170239 +0000 UTC m=+1250.051434196" Mar 11 12:19:25 crc kubenswrapper[4816]: I0311 12:19:25.917949 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.178182 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d4754df76-xnl78" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:42518->10.217.0.161:9311: read: connection reset by peer" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.178198 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d4754df76-xnl78" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:42516->10.217.0.161:9311: read: connection reset by peer" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.519597 4816 generic.go:334] "Generic (PLEG): container finished" podID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" 
containerID="eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d" exitCode=0 Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.519667 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerDied","Data":"eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d"} Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.540093 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.727724 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767053 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767141 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767204 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.767714 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs" (OuterVolumeSpecName: "logs") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.768111 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") pod \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\" (UID: \"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd\") " Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.768420 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.791130 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt" (OuterVolumeSpecName: "kube-api-access-fzbbt") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "kube-api-access-fzbbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.791339 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.820785 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.829767 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data" (OuterVolumeSpecName: "config-data") pod "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" (UID: "ba61db44-272d-4f1c-b3c6-d3fe1edb38bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.870208 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.870236 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbbt\" (UniqueName: \"kubernetes.io/projected/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-kube-api-access-fzbbt\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.870267 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:26 crc kubenswrapper[4816]: I0311 12:19:26.870278 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.530201 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d4754df76-xnl78" event={"ID":"ba61db44-272d-4f1c-b3c6-d3fe1edb38bd","Type":"ContainerDied","Data":"9cc4c282c9e0a53abd8b5254615b71e35bd7cba821c5895a1166b86769ee9a4f"} Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.530339 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d4754df76-xnl78" Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.531222 4816 scope.go:117] "RemoveContainer" containerID="eaee8f2b001ecac77cd66b481deeba2cae3b59ceedc01017e976649a89d1fa8d" Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.570057 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.573767 4816 scope.go:117] "RemoveContainer" containerID="5ba8a8c2543ebb94e1b68f6aeb2566f2e416672e55badbfef9432d4a75b3a2bf" Mar 11 12:19:27 crc kubenswrapper[4816]: I0311 12:19:27.581165 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5d4754df76-xnl78"] Mar 11 12:19:28 crc kubenswrapper[4816]: I0311 12:19:28.155418 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" path="/var/lib/kubelet/pods/ba61db44-272d-4f1c-b3c6-d3fe1edb38bd/volumes" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.110987 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 11 12:19:30 crc kubenswrapper[4816]: E0311 12:19:30.111900 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-api" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.111920 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-api" Mar 11 12:19:30 crc kubenswrapper[4816]: E0311 12:19:30.111944 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.111954 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" Mar 11 12:19:30 crc kubenswrapper[4816]: E0311 
12:19:30.111968 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.111975 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" Mar 11 12:19:30 crc kubenswrapper[4816]: E0311 12:19:30.112011 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112020 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112241 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112284 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api-log" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112298 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba61db44-272d-4f1c-b3c6-d3fe1edb38bd" containerName="barbican-api" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.112320 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-api" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.113099 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.117719 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.117858 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rwjj4" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.117972 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.127066 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.239979 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.240106 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.240241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.240378 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.342747 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.342897 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.342959 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.343050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.344074 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.351138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.351332 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.362512 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") pod \"openstackclient\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.447080 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 12:19:30 crc kubenswrapper[4816]: I0311 12:19:30.963061 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 12:19:31 crc kubenswrapper[4816]: I0311 12:19:31.207937 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 11 12:19:31 crc kubenswrapper[4816]: I0311 12:19:31.591001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"502b3843-8246-4715-9735-dfc0336caacb","Type":"ContainerStarted","Data":"8e4a3fdbe3614d064cc8bdff8752cfb65321a17270a649131b41201fcc4fda91"} Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.671663 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.675230 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.676932 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.680007 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.680038 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.691110 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.750973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: 
\"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751031 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751061 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751092 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751162 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751200 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") pod 
\"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.751275 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853003 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853082 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") pod 
\"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853163 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853191 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853214 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853283 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.853349 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " 
pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.855218 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.855679 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.861726 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.862840 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.863311 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.863510 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.868130 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.871557 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") pod \"swift-proxy-6c5b6658f-tdgsh\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:34 crc kubenswrapper[4816]: I0311 12:19:34.951147 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:19:35 crc kubenswrapper[4816]: I0311 12:19:35.003465 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:35 crc kubenswrapper[4816]: I0311 12:19:35.690879 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.598805 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.599552 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-central-agent" containerID="cri-o://64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831" gracePeriod=30 Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.599674 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="sg-core" containerID="cri-o://892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4" gracePeriod=30 Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.599674 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd" containerID="cri-o://b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3" gracePeriod=30 Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.599692 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-notification-agent" containerID="cri-o://5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf" gracePeriod=30 Mar 11 12:19:36 crc kubenswrapper[4816]: I0311 12:19:36.614894 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" 
containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.167:3000/\": EOF" Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660165 4816 generic.go:334] "Generic (PLEG): container finished" podID="10e3f184-9109-4af7-8ca6-822379e0c513" containerID="b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3" exitCode=0 Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660480 4816 generic.go:334] "Generic (PLEG): container finished" podID="10e3f184-9109-4af7-8ca6-822379e0c513" containerID="892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4" exitCode=2 Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660490 4816 generic.go:334] "Generic (PLEG): container finished" podID="10e3f184-9109-4af7-8ca6-822379e0c513" containerID="5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf" exitCode=0 Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660499 4816 generic.go:334] "Generic (PLEG): container finished" podID="10e3f184-9109-4af7-8ca6-822379e0c513" containerID="64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831" exitCode=0 Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660265 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3"} Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660546 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4"} Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf"} Mar 11 12:19:37 crc kubenswrapper[4816]: I0311 12:19:37.660575 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831"} Mar 11 12:19:40 crc kubenswrapper[4816]: I0311 12:19:40.694332 4816 generic.go:334] "Generic (PLEG): container finished" podID="43eac2c3-bace-4682-b48e-f063d6653733" containerID="0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193" exitCode=137 Mar 11 12:19:40 crc kubenswrapper[4816]: I0311 12:19:40.694957 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerDied","Data":"0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193"} Mar 11 12:19:40 crc kubenswrapper[4816]: I0311 12:19:40.739366 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.165:8776/healthcheck\": dial tcp 10.217.0.165:8776: connect: connection refused" Mar 11 12:19:41 crc kubenswrapper[4816]: W0311 12:19:41.442868 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6d90d2_e7e3_4245_b3a6_042621e01a67.slice/crio-78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882 WatchSource:0}: Error finding container 78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882: Status 404 returned error can't find the container with id 78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882 Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.518198 4816 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ceilometer-0" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.167:3000/\": dial tcp 10.217.0.167:3000: connect: connection refused" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.713596 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerStarted","Data":"78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882"} Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.809313 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938478 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938753 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938902 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwzct\" (UniqueName: 
\"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.938996 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.939114 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.940556 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.940717 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") pod \"43eac2c3-bace-4682-b48e-f063d6653733\" (UID: \"43eac2c3-bace-4682-b48e-f063d6653733\") " Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.942062 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43eac2c3-bace-4682-b48e-f063d6653733-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.943096 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs" (OuterVolumeSpecName: "logs") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.946197 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.946262 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct" (OuterVolumeSpecName: "kube-api-access-vwzct") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "kube-api-access-vwzct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.954678 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts" (OuterVolumeSpecName: "scripts") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.960019 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 12:19:41 crc kubenswrapper[4816]: I0311 12:19:41.980434 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.021771 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data" (OuterVolumeSpecName: "config-data") pod "43eac2c3-bace-4682-b48e-f063d6653733" (UID: "43eac2c3-bace-4682-b48e-f063d6653733"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043903 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043945 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043956 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwzct\" (UniqueName: \"kubernetes.io/projected/43eac2c3-bace-4682-b48e-f063d6653733-kube-api-access-vwzct\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043972 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043982 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eac2c3-bace-4682-b48e-f063d6653733-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.043995 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eac2c3-bace-4682-b48e-f063d6653733-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.145606 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") "
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.145712 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") "
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.145804 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") "
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.145965 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") "
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.146013 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") "
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.146043 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") "
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.146107 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") pod \"10e3f184-9109-4af7-8ca6-822379e0c513\" (UID: \"10e3f184-9109-4af7-8ca6-822379e0c513\") "
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.147906 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.151496 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml" (OuterVolumeSpecName: "kube-api-access-hsbml") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "kube-api-access-hsbml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.154184 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.155362 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts" (OuterVolumeSpecName: "scripts") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.196433 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.230791 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248313 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248356 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10e3f184-9109-4af7-8ca6-822379e0c513-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248369 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsbml\" (UniqueName: \"kubernetes.io/projected/10e3f184-9109-4af7-8ca6-822379e0c513-kube-api-access-hsbml\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248384 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248394 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.248406 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.255797 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data" (OuterVolumeSpecName: "config-data") pod "10e3f184-9109-4af7-8ca6-822379e0c513" (UID: "10e3f184-9109-4af7-8ca6-822379e0c513"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.350750 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e3f184-9109-4af7-8ca6-822379e0c513-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.732982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerStarted","Data":"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"}
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.733054 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerStarted","Data":"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"}
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.733111 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c5b6658f-tdgsh"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.733140 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c5b6658f-tdgsh"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.738658 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"43eac2c3-bace-4682-b48e-f063d6653733","Type":"ContainerDied","Data":"d1d101cb43433bc7eb7c833f258e91530ee7e5c09a0712cf4851d690643adb2a"}
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.738722 4816 scope.go:117] "RemoveContainer" containerID="0dc3816fea03c51cbbb58023865a3dee996cbbc76475be49172b8d011f579193"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.738929 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.753419 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"502b3843-8246-4715-9735-dfc0336caacb","Type":"ContainerStarted","Data":"fd6533a10f6d22b4d1d7a2a73ad8cc4591438b77aefeced48dbf3b4526cf28f0"}
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.758743 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10e3f184-9109-4af7-8ca6-822379e0c513","Type":"ContainerDied","Data":"6d281821384131e14e507eeaf976f8558feb01527e87cd3779946b65388e3bc7"}
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.758785 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.767992 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6c5b6658f-tdgsh" podStartSLOduration=8.767969468 podStartE2EDuration="8.767969468s" podCreationTimestamp="2026-03-11 12:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:42.762115191 +0000 UTC m=+1269.353379158" watchObservedRunningTime="2026-03-11 12:19:42.767969468 +0000 UTC m=+1269.359233435"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.783706 4816 scope.go:117] "RemoveContainer" containerID="346170c8c6b811872540539f7b2570fc326b6427186b6e7d7e167645153015dd"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.833738 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.250539474 podStartE2EDuration="12.833714517s" podCreationTimestamp="2026-03-11 12:19:30 +0000 UTC" firstStartedPulling="2026-03-11 12:19:30.970530209 +0000 UTC m=+1257.561794176" lastFinishedPulling="2026-03-11 12:19:41.553705252 +0000 UTC m=+1268.144969219" observedRunningTime="2026-03-11 12:19:42.795886106 +0000 UTC m=+1269.387150113" watchObservedRunningTime="2026-03-11 12:19:42.833714517 +0000 UTC m=+1269.424978484"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.841764 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.854242 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.857747 4816 scope.go:117] "RemoveContainer" containerID="b0f2ba98772ce0d4c1de918f6b5eca0c46d92b8201207a117fc19c82c71e70f3"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.873524 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874078 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api-log"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874102 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api-log"
Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874114 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874122 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api"
Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874166 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-notification-agent"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874173 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-notification-agent"
Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874184 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-central-agent"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874190 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-central-agent"
Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874200 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="sg-core"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874207 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="sg-core"
Mar 11 12:19:42 crc kubenswrapper[4816]: E0311 12:19:42.874223 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874230 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874431 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874446 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="proxy-httpd"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874457 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="sg-core"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874468 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-notification-agent"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874480 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" containerName="ceilometer-central-agent"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.874491 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="43eac2c3-bace-4682-b48e-f063d6653733" containerName="cinder-api-log"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.875624 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.877861 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.878030 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.880749 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.896771 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.906338 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.917901 4816 scope.go:117] "RemoveContainer" containerID="892ed54ab6b1b8e78f2c10457a1ac792f459dfcc72db435ed64164634c50c4f4"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.954657 4816 scope.go:117] "RemoveContainer" containerID="5ea55c5fdec26a804e311808a0dab722dc704515cc19343dfae8f51e1980dcdf"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.960731 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964325 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964470 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964502 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964573 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964627 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964730 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964767 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964829 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.964859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.979401 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.994812 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.995325 4816 scope.go:117] "RemoveContainer" containerID="64d1dc2db1be47dc15a33d606bf556173a42132151a3b69e28ce73757040e831"
Mar 11 12:19:42 crc kubenswrapper[4816]: I0311 12:19:42.998161 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:42.998846 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.002529 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069107 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069446 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069470 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069514 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069539 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069556 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069583 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.069668 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.070313 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.070687 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.077068 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.079546 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.093329 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.102341 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.102696 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.103553 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.103635 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") pod \"cinder-api-0\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.174946 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175026 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175125 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175156 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175283 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.175313 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.203485 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276647 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276691 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276767 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276783 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276854 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.276874 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.281981 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.282332 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.282402 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.287870 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.288287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.288616 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.312083 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") pod \"ceilometer-0\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.312644 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.837951 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 11 12:19:43 crc kubenswrapper[4816]: W0311 12:19:43.849490 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c94c19c_3ccb_43cc_ab41_92baa3141f73.slice/crio-601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3 WatchSource:0}: Error finding container 601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3: Status 404 returned error can't find the container with id 601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3
Mar 11 12:19:43 crc kubenswrapper[4816]: W0311 12:19:43.949670 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe1c6061_c54b_4bd7_bcff_1a0047599189.slice/crio-bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554 WatchSource:0}: Error finding container bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554: Status 404 returned error can't find the container with id bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554
Mar 11 12:19:43 crc kubenswrapper[4816]: I0311 12:19:43.951791 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.101461 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zv62x"]
Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.102757 4816 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.115699 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zv62x"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.186540 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e3f184-9109-4af7-8ca6-822379e0c513" path="/var/lib/kubelet/pods/10e3f184-9109-4af7-8ca6-822379e0c513/volumes" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.188052 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43eac2c3-bace-4682-b48e-f063d6653733" path="/var/lib/kubelet/pods/43eac2c3-bace-4682-b48e-f063d6653733/volumes" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.201030 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.201104 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.245549 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.247179 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.258187 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.303424 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.303483 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.305330 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.312081 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.313497 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.319232 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.332165 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") pod \"nova-api-db-create-zv62x\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.334683 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.406133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.406188 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.406229 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") pod \"nova-api-91ce-account-create-update-n8mz8\" 
(UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.406277 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.432403 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-txccq"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.434001 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.448095 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-txccq"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.459898 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508055 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508124 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508170 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508203 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508238 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") pod 
\"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.508319 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.512621 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.513534 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.525616 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.527275 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.532058 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.543162 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") pod \"nova-cell0-db-create-4z7mr\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.545500 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") pod \"nova-api-91ce-account-create-update-n8mz8\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.572181 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.579387 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610149 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610216 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610277 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610335 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.610998 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.640850 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.641873 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") pod \"nova-cell1-db-create-txccq\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.714173 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.714318 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.715449 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: 
\"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.730457 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.731929 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.743002 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.746444 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"] Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.761849 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.784953 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") pod \"nova-cell0-53ba-account-create-update-2vf2k\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.817279 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.817415 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.835274 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerStarted","Data":"601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3"} Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.860114 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554"} Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.919195 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.919386 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.920568 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.921155 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:44 crc kubenswrapper[4816]: I0311 12:19:44.943501 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") pod \"nova-cell1-377b-account-create-update-gb4b2\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.125475 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.130183 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zv62x"] Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.286688 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"] Mar 11 12:19:45 crc kubenswrapper[4816]: W0311 12:19:45.305952 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ec4faaf_e219_4b01_b3b9_0d6757a38154.slice/crio-2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38 WatchSource:0}: Error finding container 2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38: Status 404 returned error can't find the container with id 2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38 Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.359509 4816 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"] Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.454891 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-txccq"] Mar 11 12:19:45 crc kubenswrapper[4816]: W0311 12:19:45.585427 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c9952da_6281_45f2_8b45_30caa27b8d39.slice/crio-98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399 WatchSource:0}: Error finding container 98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399: Status 404 returned error can't find the container with id 98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399 Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.631959 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"] Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.744548 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"] Mar 11 12:19:45 crc kubenswrapper[4816]: W0311 12:19:45.776154 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod403fec7f_c194_4bdd_a620_34aefb5d677c.slice/crio-0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad WatchSource:0}: Error finding container 0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad: Status 404 returned error can't find the container with id 0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.883557 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" 
event={"ID":"403fec7f-c194-4bdd-a620-34aefb5d677c","Type":"ContainerStarted","Data":"0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.889179 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-txccq" event={"ID":"7c9952da-6281-45f2-8b45-30caa27b8d39","Type":"ContainerStarted","Data":"98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.896737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerStarted","Data":"691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.918471 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-91ce-account-create-update-n8mz8" event={"ID":"35fe8af0-2f02-4d81-ae03-9d399900494c","Type":"ContainerStarted","Data":"0f5d94dc5d9bb04750c9b3d2e89fcd5a5d6e20ec2f4a19899cb047b2927291a4"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.918532 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-91ce-account-create-update-n8mz8" event={"ID":"35fe8af0-2f02-4d81-ae03-9d399900494c","Type":"ContainerStarted","Data":"0cb78380f29d4d52692e33442f665655f581626d9173b5bf157abd8b1bb91034"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.928129 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" event={"ID":"fee1eb20-6fbe-4e59-a434-54c2e8a6165d","Type":"ContainerStarted","Data":"34decf19ef7bac29bb5073f92442b233e2f6b57a40f58b92a1d00cba4bde5c43"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.938740 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-91ce-account-create-update-n8mz8" podStartSLOduration=1.938721559 
podStartE2EDuration="1.938721559s" podCreationTimestamp="2026-03-11 12:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:45.936681641 +0000 UTC m=+1272.527945608" watchObservedRunningTime="2026-03-11 12:19:45.938721559 +0000 UTC m=+1272.529985526" Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.939080 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.954053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4z7mr" event={"ID":"1ec4faaf-e219-4b01-b3b9-0d6757a38154","Type":"ContainerStarted","Data":"fd551dbdb54bb8de807a245da392a1fc03bca9f397e581b66063faafeaf38a5f"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.954432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4z7mr" event={"ID":"1ec4faaf-e219-4b01-b3b9-0d6757a38154","Type":"ContainerStarted","Data":"2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.957880 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zv62x" event={"ID":"a0e0ff63-3d12-4174-9341-ceb21109e000","Type":"ContainerStarted","Data":"cbfbf586e19291c8ee373bf860029353b3f56429b7bf9015d736b9982aa4797f"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.958001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zv62x" event={"ID":"a0e0ff63-3d12-4174-9341-ceb21109e000","Type":"ContainerStarted","Data":"671c06c98bca09ed2d0cbf96fe51a512291245bc7bc97f794c21bccc6c6c997a"} Mar 11 12:19:45 crc kubenswrapper[4816]: I0311 12:19:45.988099 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-4z7mr" podStartSLOduration=1.988078869 podStartE2EDuration="1.988078869s" podCreationTimestamp="2026-03-11 12:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:45.975059787 +0000 UTC m=+1272.566323754" watchObservedRunningTime="2026-03-11 12:19:45.988078869 +0000 UTC m=+1272.579342836" Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.166049 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.974017 4816 generic.go:334] "Generic (PLEG): container finished" podID="35fe8af0-2f02-4d81-ae03-9d399900494c" containerID="0f5d94dc5d9bb04750c9b3d2e89fcd5a5d6e20ec2f4a19899cb047b2927291a4" exitCode=0 Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.974115 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-91ce-account-create-update-n8mz8" event={"ID":"35fe8af0-2f02-4d81-ae03-9d399900494c","Type":"ContainerDied","Data":"0f5d94dc5d9bb04750c9b3d2e89fcd5a5d6e20ec2f4a19899cb047b2927291a4"} Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.976175 4816 generic.go:334] "Generic (PLEG): container finished" podID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" containerID="2ba25af6bbe93bf77e8ed2bed1866df9a0d1cdcadbd32ffc70070db8155b1914" exitCode=0 Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.976267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" event={"ID":"fee1eb20-6fbe-4e59-a434-54c2e8a6165d","Type":"ContainerDied","Data":"2ba25af6bbe93bf77e8ed2bed1866df9a0d1cdcadbd32ffc70070db8155b1914"} Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.978652 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c"} Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.980665 4816 generic.go:334] "Generic (PLEG): container finished" podID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" containerID="fd551dbdb54bb8de807a245da392a1fc03bca9f397e581b66063faafeaf38a5f" exitCode=0 Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.980713 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4z7mr" event={"ID":"1ec4faaf-e219-4b01-b3b9-0d6757a38154","Type":"ContainerDied","Data":"fd551dbdb54bb8de807a245da392a1fc03bca9f397e581b66063faafeaf38a5f"} Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.982752 4816 generic.go:334] "Generic (PLEG): container finished" podID="a0e0ff63-3d12-4174-9341-ceb21109e000" containerID="cbfbf586e19291c8ee373bf860029353b3f56429b7bf9015d736b9982aa4797f" exitCode=0 Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.982794 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zv62x" event={"ID":"a0e0ff63-3d12-4174-9341-ceb21109e000","Type":"ContainerDied","Data":"cbfbf586e19291c8ee373bf860029353b3f56429b7bf9015d736b9982aa4797f"} Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.986106 4816 generic.go:334] "Generic (PLEG): container finished" podID="403fec7f-c194-4bdd-a620-34aefb5d677c" containerID="090174f400ae3d182bc1e17d475eb20c26198249c703a798e2b253812bea946b" exitCode=0 Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.986163 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" event={"ID":"403fec7f-c194-4bdd-a620-34aefb5d677c","Type":"ContainerDied","Data":"090174f400ae3d182bc1e17d475eb20c26198249c703a798e2b253812bea946b"} Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.987343 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="7c9952da-6281-45f2-8b45-30caa27b8d39" containerID="6a15e8693d1f25cf8eeefb7b013bbcd57f9676d5cee6b31111e7f71f5ea2e5ca" exitCode=0 Mar 11 12:19:46 crc kubenswrapper[4816]: I0311 12:19:46.987382 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-txccq" event={"ID":"7c9952da-6281-45f2-8b45-30caa27b8d39","Type":"ContainerDied","Data":"6a15e8693d1f25cf8eeefb7b013bbcd57f9676d5cee6b31111e7f71f5ea2e5ca"} Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.000215 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerStarted","Data":"c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06"} Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.000561 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.207930 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.207911215 podStartE2EDuration="5.207911215s" podCreationTimestamp="2026-03-11 12:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:19:47.15070207 +0000 UTC m=+1273.741966037" watchObservedRunningTime="2026-03-11 12:19:47.207911215 +0000 UTC m=+1273.799175172" Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.665837 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.742195 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") pod \"a0e0ff63-3d12-4174-9341-ceb21109e000\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.742278 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") pod \"a0e0ff63-3d12-4174-9341-ceb21109e000\" (UID: \"a0e0ff63-3d12-4174-9341-ceb21109e000\") " Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.742838 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0e0ff63-3d12-4174-9341-ceb21109e000" (UID: "a0e0ff63-3d12-4174-9341-ceb21109e000"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.749790 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2" (OuterVolumeSpecName: "kube-api-access-lpzw2") pod "a0e0ff63-3d12-4174-9341-ceb21109e000" (UID: "a0e0ff63-3d12-4174-9341-ceb21109e000"). InnerVolumeSpecName "kube-api-access-lpzw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.843817 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e0ff63-3d12-4174-9341-ceb21109e000-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:47 crc kubenswrapper[4816]: I0311 12:19:47.843865 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpzw2\" (UniqueName: \"kubernetes.io/projected/a0e0ff63-3d12-4174-9341-ceb21109e000-kube-api-access-lpzw2\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.013052 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zv62x" event={"ID":"a0e0ff63-3d12-4174-9341-ceb21109e000","Type":"ContainerDied","Data":"671c06c98bca09ed2d0cbf96fe51a512291245bc7bc97f794c21bccc6c6c997a"} Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.013113 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671c06c98bca09ed2d0cbf96fe51a512291245bc7bc97f794c21bccc6c6c997a" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.014451 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zv62x" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.016338 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6"} Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.352716 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.436883 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"] Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.437307 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9df8757bb-rzb52" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-api" containerID="cri-o://385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7" gracePeriod=30 Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.437732 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9df8757bb-rzb52" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-httpd" containerID="cri-o://3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2" gracePeriod=30 Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.535621 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.560115 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") pod \"7c9952da-6281-45f2-8b45-30caa27b8d39\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.560179 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") pod \"7c9952da-6281-45f2-8b45-30caa27b8d39\" (UID: \"7c9952da-6281-45f2-8b45-30caa27b8d39\") " Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.561501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c9952da-6281-45f2-8b45-30caa27b8d39" (UID: "7c9952da-6281-45f2-8b45-30caa27b8d39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.568338 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl" (OuterVolumeSpecName: "kube-api-access-sb6tl") pod "7c9952da-6281-45f2-8b45-30caa27b8d39" (UID: "7c9952da-6281-45f2-8b45-30caa27b8d39"). InnerVolumeSpecName "kube-api-access-sb6tl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.666161 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c9952da-6281-45f2-8b45-30caa27b8d39-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.666197 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6tl\" (UniqueName: \"kubernetes.io/projected/7c9952da-6281-45f2-8b45-30caa27b8d39-kube-api-access-sb6tl\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.963864 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.976118 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:48 crc kubenswrapper[4816]: I0311 12:19:48.984306 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.009888 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.054679 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" event={"ID":"fee1eb20-6fbe-4e59-a434-54c2e8a6165d","Type":"ContainerDied","Data":"34decf19ef7bac29bb5073f92442b233e2f6b57a40f58b92a1d00cba4bde5c43"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.054730 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34decf19ef7bac29bb5073f92442b233e2f6b57a40f58b92a1d00cba4bde5c43" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.054807 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-53ba-account-create-update-2vf2k" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.079420 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4z7mr" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.079914 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") pod \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.080070 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4z7mr" event={"ID":"1ec4faaf-e219-4b01-b3b9-0d6757a38154","Type":"ContainerDied","Data":"2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.080113 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f56abcf45077723001cd758c3b00fda8dcf2b28cf4f89c65baa6a6b4cfb7a38" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.080317 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") pod \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\" (UID: \"1ec4faaf-e219-4b01-b3b9-0d6757a38154\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.082433 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ec4faaf-e219-4b01-b3b9-0d6757a38154" (UID: "1ec4faaf-e219-4b01-b3b9-0d6757a38154"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.090956 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" event={"ID":"403fec7f-c194-4bdd-a620-34aefb5d677c","Type":"ContainerDied","Data":"0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.091018 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0429ff5f884160565a9cda58cf166816738777ca43ea5e47ebac9e8d47d354ad" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.091148 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-377b-account-create-update-gb4b2" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.093152 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5" (OuterVolumeSpecName: "kube-api-access-vrnr5") pod "1ec4faaf-e219-4b01-b3b9-0d6757a38154" (UID: "1ec4faaf-e219-4b01-b3b9-0d6757a38154"). InnerVolumeSpecName "kube-api-access-vrnr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.099068 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-txccq" event={"ID":"7c9952da-6281-45f2-8b45-30caa27b8d39","Type":"ContainerDied","Data":"98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.099120 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ee6dc5ae584921adf996e86778c6141a6b8e7df5c376e7181885193ecb1399" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.099227 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-txccq" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.105505 4816 generic.go:334] "Generic (PLEG): container finished" podID="68498f16-b5c3-4960-8565-7ae628fc3122" containerID="3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2" exitCode=0 Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.105623 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerDied","Data":"3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.107224 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-91ce-account-create-update-n8mz8" event={"ID":"35fe8af0-2f02-4d81-ae03-9d399900494c","Type":"ContainerDied","Data":"0cb78380f29d4d52692e33442f665655f581626d9173b5bf157abd8b1bb91034"} Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.107282 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cb78380f29d4d52692e33442f665655f581626d9173b5bf157abd8b1bb91034" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.107346 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-91ce-account-create-update-n8mz8" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182215 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") pod \"403fec7f-c194-4bdd-a620-34aefb5d677c\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182475 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") pod \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") pod \"35fe8af0-2f02-4d81-ae03-9d399900494c\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") pod \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\" (UID: \"fee1eb20-6fbe-4e59-a434-54c2e8a6165d\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182705 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") pod \"403fec7f-c194-4bdd-a620-34aefb5d677c\" (UID: \"403fec7f-c194-4bdd-a620-34aefb5d677c\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.182728 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") pod \"35fe8af0-2f02-4d81-ae03-9d399900494c\" (UID: \"35fe8af0-2f02-4d81-ae03-9d399900494c\") " Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183035 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35fe8af0-2f02-4d81-ae03-9d399900494c" (UID: "35fe8af0-2f02-4d81-ae03-9d399900494c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183071 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fee1eb20-6fbe-4e59-a434-54c2e8a6165d" (UID: "fee1eb20-6fbe-4e59-a434-54c2e8a6165d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183115 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "403fec7f-c194-4bdd-a620-34aefb5d677c" (UID: "403fec7f-c194-4bdd-a620-34aefb5d677c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183492 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/403fec7f-c194-4bdd-a620-34aefb5d677c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183600 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrnr5\" (UniqueName: \"kubernetes.io/projected/1ec4faaf-e219-4b01-b3b9-0d6757a38154-kube-api-access-vrnr5\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183621 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec4faaf-e219-4b01-b3b9-0d6757a38154-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183631 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.183642 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35fe8af0-2f02-4d81-ae03-9d399900494c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.187178 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8" (OuterVolumeSpecName: "kube-api-access-zjhm8") pod "fee1eb20-6fbe-4e59-a434-54c2e8a6165d" (UID: "fee1eb20-6fbe-4e59-a434-54c2e8a6165d"). InnerVolumeSpecName "kube-api-access-zjhm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.187226 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l" (OuterVolumeSpecName: "kube-api-access-dwq9l") pod "403fec7f-c194-4bdd-a620-34aefb5d677c" (UID: "403fec7f-c194-4bdd-a620-34aefb5d677c"). InnerVolumeSpecName "kube-api-access-dwq9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.188526 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt" (OuterVolumeSpecName: "kube-api-access-952tt") pod "35fe8af0-2f02-4d81-ae03-9d399900494c" (UID: "35fe8af0-2f02-4d81-ae03-9d399900494c"). InnerVolumeSpecName "kube-api-access-952tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.286135 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwq9l\" (UniqueName: \"kubernetes.io/projected/403fec7f-c194-4bdd-a620-34aefb5d677c-kube-api-access-dwq9l\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.286175 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-952tt\" (UniqueName: \"kubernetes.io/projected/35fe8af0-2f02-4d81-ae03-9d399900494c-kube-api-access-952tt\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:49 crc kubenswrapper[4816]: I0311 12:19:49.286187 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjhm8\" (UniqueName: \"kubernetes.io/projected/fee1eb20-6fbe-4e59-a434-54c2e8a6165d-kube-api-access-zjhm8\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.017154 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.025259 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerStarted","Data":"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a"} Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124791 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-central-agent" containerID="cri-o://17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" gracePeriod=30 Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124829 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="proxy-httpd" containerID="cri-o://e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" gracePeriod=30 Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124914 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="sg-core" containerID="cri-o://0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" gracePeriod=30 Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.124973 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-notification-agent" containerID="cri-o://967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" gracePeriod=30 Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.179741 4816 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.980110999 podStartE2EDuration="8.179710721s" podCreationTimestamp="2026-03-11 12:19:42 +0000 UTC" firstStartedPulling="2026-03-11 12:19:43.952563467 +0000 UTC m=+1270.543827434" lastFinishedPulling="2026-03-11 12:19:49.152163179 +0000 UTC m=+1275.743427156" observedRunningTime="2026-03-11 12:19:50.166276667 +0000 UTC m=+1276.757540634" watchObservedRunningTime="2026-03-11 12:19:50.179710721 +0000 UTC m=+1276.770974688" Mar 11 12:19:50 crc kubenswrapper[4816]: I0311 12:19:50.490482 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-64584d7649-mb6k8" podUID="bd930e1b-a508-4a64-8825-9800b8010d59" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152619 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerID="e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" exitCode=0 Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152671 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerID="0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" exitCode=2 Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152685 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerID="967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" exitCode=0 Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152714 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a"} Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152750 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6"} Mar 11 12:19:51 crc kubenswrapper[4816]: I0311 12:19:51.152765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c"} Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.631188 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670180 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670273 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670394 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 
12:19:52.670439 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670563 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670673 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.670726 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") pod \"fe1c6061-c54b-4bd7-bcff-1a0047599189\" (UID: \"fe1c6061-c54b-4bd7-bcff-1a0047599189\") " Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.671142 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.671782 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.679458 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p" (OuterVolumeSpecName: "kube-api-access-gh28p") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "kube-api-access-gh28p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.684789 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.688594 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts" (OuterVolumeSpecName: "scripts") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.739780 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.774324 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.774363 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.774375 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh28p\" (UniqueName: \"kubernetes.io/projected/fe1c6061-c54b-4bd7-bcff-1a0047599189-kube-api-access-gh28p\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.774388 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe1c6061-c54b-4bd7-bcff-1a0047599189-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.777951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.789349 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data" (OuterVolumeSpecName: "config-data") pod "fe1c6061-c54b-4bd7-bcff-1a0047599189" (UID: "fe1c6061-c54b-4bd7-bcff-1a0047599189"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.877340 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:52 crc kubenswrapper[4816]: I0311 12:19:52.877732 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1c6061-c54b-4bd7-bcff-1a0047599189-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.175956 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerID="17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" exitCode=0 Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.176030 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8"} Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.176087 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe1c6061-c54b-4bd7-bcff-1a0047599189","Type":"ContainerDied","Data":"bf43034a6d989e03fbb9afda66ffef8a89a7703f9ab28abf0ae391e957eb6554"} Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.176113 4816 scope.go:117] "RemoveContainer" containerID="e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.176365 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.216178 4816 scope.go:117] "RemoveContainer" containerID="0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.245644 4816 scope.go:117] "RemoveContainer" containerID="967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.252617 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.271560 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.278821 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279291 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="sg-core" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279308 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="sg-core" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279326 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-central-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279333 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-central-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279345 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fe8af0-2f02-4d81-ae03-9d399900494c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279351 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="35fe8af0-2f02-4d81-ae03-9d399900494c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279364 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-notification-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279370 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-notification-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279384 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403fec7f-c194-4bdd-a620-34aefb5d677c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279389 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="403fec7f-c194-4bdd-a620-34aefb5d677c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279398 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e0ff63-3d12-4174-9341-ceb21109e000" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279403 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e0ff63-3d12-4174-9341-ceb21109e000" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279433 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9952da-6281-45f2-8b45-30caa27b8d39" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279439 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9952da-6281-45f2-8b45-30caa27b8d39" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279448 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" containerName="mariadb-account-create-update" Mar 11 
12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279454 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279466 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279471 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.279483 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="proxy-httpd" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279489 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="proxy-httpd" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279660 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="403fec7f-c194-4bdd-a620-34aefb5d677c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279677 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="sg-core" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279686 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279696 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-central-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279705 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a0e0ff63-3d12-4174-9341-ceb21109e000" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279716 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c9952da-6281-45f2-8b45-30caa27b8d39" containerName="mariadb-database-create" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279725 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="ceilometer-notification-agent" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279736 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fe8af0-2f02-4d81-ae03-9d399900494c" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279751 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" containerName="proxy-httpd" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.279756 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" containerName="mariadb-account-create-update" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.281684 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.289688 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.289703 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.343960 4816 scope.go:117] "RemoveContainer" containerID="17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.355668 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390123 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390160 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390208 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390282 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390317 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390344 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.390686 4816 scope.go:117] "RemoveContainer" containerID="e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.391468 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a\": container with ID starting with e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a not found: ID does not exist" containerID="e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a" Mar 11 12:19:53 crc 
kubenswrapper[4816]: I0311 12:19:53.391559 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a"} err="failed to get container status \"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a\": rpc error: code = NotFound desc = could not find container \"e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a\": container with ID starting with e19bf99d556ade6a51b374ef26b342f20bf8f351b160b3dc04c013402591b75a not found: ID does not exist" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.391604 4816 scope.go:117] "RemoveContainer" containerID="0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.392795 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6\": container with ID starting with 0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6 not found: ID does not exist" containerID="0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.392824 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6"} err="failed to get container status \"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6\": rpc error: code = NotFound desc = could not find container \"0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6\": container with ID starting with 0d74adf6865fc807216c0f784e71de5b04800932ef37cf1367ed2f124fa6bff6 not found: ID does not exist" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.392843 4816 scope.go:117] "RemoveContainer" containerID="967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" Mar 11 
12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.393466 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c\": container with ID starting with 967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c not found: ID does not exist" containerID="967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.393503 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c"} err="failed to get container status \"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c\": rpc error: code = NotFound desc = could not find container \"967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c\": container with ID starting with 967e47ce92a092604089ebcbd14d83152d557d03654e565fcc9256588c7b557c not found: ID does not exist" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.393530 4816 scope.go:117] "RemoveContainer" containerID="17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" Mar 11 12:19:53 crc kubenswrapper[4816]: E0311 12:19:53.398422 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8\": container with ID starting with 17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8 not found: ID does not exist" containerID="17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.398526 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8"} err="failed to get container status 
\"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8\": rpc error: code = NotFound desc = could not find container \"17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8\": container with ID starting with 17d113833184841a8ffdf30f6cf9f881d5ed7cf51f87234b41d0a9c017bb1de8 not found: ID does not exist" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492026 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492107 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492149 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492204 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492230 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492268 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.492313 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.493025 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.493156 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.498579 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.500285 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.500993 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.516225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.522283 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") pod \"ceilometer-0\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " pod="openstack/ceilometer-0" Mar 11 12:19:53 crc kubenswrapper[4816]: I0311 12:19:53.661687 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.145256 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe1c6061-c54b-4bd7-bcff-1a0047599189" path="/var/lib/kubelet/pods/fe1c6061-c54b-4bd7-bcff-1a0047599189/volumes" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.195789 4816 generic.go:334] "Generic (PLEG): container finished" podID="68498f16-b5c3-4960-8565-7ae628fc3122" containerID="385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7" exitCode=0 Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.195856 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerDied","Data":"385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7"} Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.209501 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.266714 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.326154 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.326706 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.327435 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.327586 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.327849 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") pod \"68498f16-b5c3-4960-8565-7ae628fc3122\" (UID: \"68498f16-b5c3-4960-8565-7ae628fc3122\") " Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.337876 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.348493 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks" (OuterVolumeSpecName: "kube-api-access-9clks") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "kube-api-access-9clks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.410006 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config" (OuterVolumeSpecName: "config") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.417416 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.432288 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.432341 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.432352 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.432361 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9clks\" (UniqueName: \"kubernetes.io/projected/68498f16-b5c3-4960-8565-7ae628fc3122-kube-api-access-9clks\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:54 
crc kubenswrapper[4816]: I0311 12:19:54.452364 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "68498f16-b5c3-4960-8565-7ae628fc3122" (UID: "68498f16-b5c3-4960-8565-7ae628fc3122"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:19:54 crc kubenswrapper[4816]: I0311 12:19:54.535157 4816 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68498f16-b5c3-4960-8565-7ae628fc3122-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105059 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:19:55 crc kubenswrapper[4816]: E0311 12:19:55.105581 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-httpd" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105595 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-httpd" Mar 11 12:19:55 crc kubenswrapper[4816]: E0311 12:19:55.105604 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-api" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105611 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-api" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105838 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-httpd" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.105856 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="68498f16-b5c3-4960-8565-7ae628fc3122" containerName="neutron-api" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.106618 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.112556 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.113018 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.116321 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f2q4d" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.147876 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.148036 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.148943 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " 
pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.149011 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.149436 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.208588 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9df8757bb-rzb52" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.208617 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9df8757bb-rzb52" event={"ID":"68498f16-b5c3-4960-8565-7ae628fc3122","Type":"ContainerDied","Data":"ca48397e5444728848156fabb2c1b9060ca19d57a1c1905996ce53cd9a54fc09"} Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.208698 4816 scope.go:117] "RemoveContainer" containerID="3a4b8f5199cb2db96176f7d26ac1288036fcf9dd3deb012c7c6cb2bd6febc6c2" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.210744 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"bab09cd01583eebdccfb229a37532b6f0674000f5d1606f07d42c8adaa348948"} Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.251586 4816 scope.go:117] "RemoveContainer" containerID="385c6a6a7483bf3ffb2a31553a973012c1161303ce29917595a5f314788786f7" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.252284 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.252504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.252659 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.252912 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.259637 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"] Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.263038 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: 
I0311 12:19:55.263038 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.265627 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.273602 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") pod \"nova-cell0-conductor-db-sync-r2t5s\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.293330 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9df8757bb-rzb52"] Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.435132 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:19:55 crc kubenswrapper[4816]: I0311 12:19:55.552101 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 11 12:19:56 crc kubenswrapper[4816]: W0311 12:19:56.055458 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6268fe92_5c93_43c7_95bc_f30befda5d65.slice/crio-1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e WatchSource:0}: Error finding container 1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e: Status 404 returned error can't find the container with id 1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e Mar 11 12:19:56 crc kubenswrapper[4816]: I0311 12:19:56.057872 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:19:56 crc kubenswrapper[4816]: I0311 12:19:56.142512 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68498f16-b5c3-4960-8565-7ae628fc3122" path="/var/lib/kubelet/pods/68498f16-b5c3-4960-8565-7ae628fc3122/volumes" Mar 11 12:19:56 crc kubenswrapper[4816]: I0311 12:19:56.225133 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f"} Mar 11 12:19:56 crc kubenswrapper[4816]: I0311 12:19:56.226931 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" event={"ID":"6268fe92-5c93-43c7-95bc-f30befda5d65","Type":"ContainerStarted","Data":"1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e"} Mar 11 12:19:58 crc kubenswrapper[4816]: I0311 12:19:58.119373 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:19:58 crc 
kubenswrapper[4816]: I0311 12:19:58.254678 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd"} Mar 11 12:19:58 crc kubenswrapper[4816]: I0311 12:19:58.254749 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641"} Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.143872 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.146083 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.149462 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.149786 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.150018 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.152057 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.295151 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") pod \"auto-csr-approver-29553860-9kp4n\" (UID: 
\"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\") " pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.397987 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") pod \"auto-csr-approver-29553860-9kp4n\" (UID: \"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\") " pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.430574 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") pod \"auto-csr-approver-29553860-9kp4n\" (UID: \"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\") " pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:00 crc kubenswrapper[4816]: I0311 12:20:00.467220 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:02 crc kubenswrapper[4816]: I0311 12:20:02.629683 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:02 crc kubenswrapper[4816]: I0311 12:20:02.630571 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-log" containerID="cri-o://4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67" gracePeriod=30 Mar 11 12:20:02 crc kubenswrapper[4816]: I0311 12:20:02.631281 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-httpd" containerID="cri-o://ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea" gracePeriod=30 Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.325998 4816 generic.go:334] "Generic (PLEG): container finished" podID="439b686e-927d-425a-a218-807220ae1e95" containerID="4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67" exitCode=143 Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.326073 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerDied","Data":"4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67"} Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.557107 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.558590 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-httpd" 
containerID="cri-o://44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" gracePeriod=30 Mar 11 12:20:03 crc kubenswrapper[4816]: I0311 12:20:03.559069 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-log" containerID="cri-o://d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" gracePeriod=30 Mar 11 12:20:03 crc kubenswrapper[4816]: E0311 12:20:03.781421 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9d3606c_b28d_4028_93fc_535afa127cd6.slice/crio-conmon-d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:20:04 crc kubenswrapper[4816]: I0311 12:20:04.352508 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerID="d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" exitCode=143 Mar 11 12:20:04 crc kubenswrapper[4816]: I0311 12:20:04.352651 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerDied","Data":"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b"} Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.382902 4816 generic.go:334] "Generic (PLEG): container finished" podID="439b686e-927d-425a-a218-807220ae1e95" containerID="ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea" exitCode=0 Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.383053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerDied","Data":"ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea"} Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.633083 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.725374 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753181 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753234 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753289 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753407 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753594 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753622 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.753695 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") pod \"439b686e-927d-425a-a218-807220ae1e95\" (UID: \"439b686e-927d-425a-a218-807220ae1e95\") " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.754573 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.756635 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs" (OuterVolumeSpecName: "logs") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.770129 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts" (OuterVolumeSpecName: "scripts") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.772719 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp" (OuterVolumeSpecName: "kube-api-access-mlzfp") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "kube-api-access-mlzfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.778845 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.798483 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.856972 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857033 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857054 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/439b686e-927d-425a-a218-807220ae1e95-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857100 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857114 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlzfp\" (UniqueName: \"kubernetes.io/projected/439b686e-927d-425a-a218-807220ae1e95-kube-api-access-mlzfp\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.857134 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.868269 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.891693 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data" (OuterVolumeSpecName: "config-data") pod "439b686e-927d-425a-a218-807220ae1e95" (UID: "439b686e-927d-425a-a218-807220ae1e95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.903537 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.960324 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.960370 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:06 crc kubenswrapper[4816]: I0311 12:20:06.960385 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/439b686e-927d-425a-a218-807220ae1e95-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc 
kubenswrapper[4816]: I0311 12:20:07.265382 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368661 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368771 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368850 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368880 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.368938 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369023 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369112 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369150 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") pod \"a9d3606c-b28d-4028-93fc-535afa127cd6\" (UID: \"a9d3606c-b28d-4028-93fc-535afa127cd6\") " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369375 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs" (OuterVolumeSpecName: "logs") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.369425 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.370101 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.370128 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9d3606c-b28d-4028-93fc-535afa127cd6-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.378131 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz" (OuterVolumeSpecName: "kube-api-access-2tthz") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "kube-api-access-2tthz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.383840 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.398739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts" (OuterVolumeSpecName: "scripts") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.408226 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerStarted","Data":"2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.408454 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-central-agent" containerID="cri-o://29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f" gracePeriod=30 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.408776 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.410392 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="proxy-httpd" containerID="cri-o://2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01" gracePeriod=30 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.410427 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-notification-agent" containerID="cri-o://09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641" gracePeriod=30 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.410370 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="sg-core" containerID="cri-o://caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd" gracePeriod=30 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.427408 4816 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.427744 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"439b686e-927d-425a-a218-807220ae1e95","Type":"ContainerDied","Data":"d6a32b27dcd7e08e03a755df62f2d58811a9c80acc32eb96770a7186a1ec069d"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.427804 4816 scope.go:117] "RemoveContainer" containerID="ecaa5276e0e1e71d262bf64a26711871fc4a429158857be7af8e6465f4bd05ea" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.429520 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.441927 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" event={"ID":"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9","Type":"ContainerStarted","Data":"945268d86ba621c8fa8980dff5e43070ffe204d163fdf1c51439bce4ca2b4338"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.448417 4816 generic.go:334] "Generic (PLEG): container finished" podID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerID="44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" exitCode=0 Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.448500 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerDied","Data":"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 
12:20:07.448540 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9d3606c-b28d-4028-93fc-535afa127cd6","Type":"ContainerDied","Data":"ccc820b0417c4ece231f5070aebe453d4f5f6552e1d188623620714789da98ed"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.448607 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.455437 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.062832305 podStartE2EDuration="14.455415144s" podCreationTimestamp="2026-03-11 12:19:53 +0000 UTC" firstStartedPulling="2026-03-11 12:19:54.269261435 +0000 UTC m=+1280.860525402" lastFinishedPulling="2026-03-11 12:20:05.661844274 +0000 UTC m=+1292.253108241" observedRunningTime="2026-03-11 12:20:07.45491743 +0000 UTC m=+1294.046181417" watchObservedRunningTime="2026-03-11 12:20:07.455415144 +0000 UTC m=+1294.046679111" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.456481 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data" (OuterVolumeSpecName: "config-data") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.461853 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" event={"ID":"6268fe92-5c93-43c7-95bc-f30befda5d65","Type":"ContainerStarted","Data":"f424c56cb69a088a064cc5d2e599b7db758b5e10ecf876c1586a8816a4d96acd"} Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479533 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479584 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479598 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479613 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tthz\" (UniqueName: \"kubernetes.io/projected/a9d3606c-b28d-4028-93fc-535afa127cd6-kube-api-access-2tthz\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.479625 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.498866 4816 scope.go:117] "RemoveContainer" containerID="4b5cec87927ba388b30feb742e4d193b529502bf6a8355ed2d02b5d41c560b67" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.504230 4816 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.509985 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.522109 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a9d3606c-b28d-4028-93fc-535afa127cd6" (UID: "a9d3606c-b28d-4028-93fc-535afa127cd6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.527675 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543071 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.543575 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543594 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.543611 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543619 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.543643 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" 
containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543650 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.543669 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543676 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543847 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543865 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="439b686e-927d-425a-a218-807220ae1e95" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543874 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-log" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.543881 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" containerName="glance-httpd" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.544907 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.548589 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.574307 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.579485 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" podStartSLOduration=2.38695866 podStartE2EDuration="12.579454028s" podCreationTimestamp="2026-03-11 12:19:55 +0000 UTC" firstStartedPulling="2026-03-11 12:19:56.062334 +0000 UTC m=+1282.653597967" lastFinishedPulling="2026-03-11 12:20:06.254829368 +0000 UTC m=+1292.846093335" observedRunningTime="2026-03-11 12:20:07.513701079 +0000 UTC m=+1294.104965046" watchObservedRunningTime="2026-03-11 12:20:07.579454028 +0000 UTC m=+1294.170717995" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.582215 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d3606c-b28d-4028-93fc-535afa127cd6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.582242 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.600339 4816 scope.go:117] "RemoveContainer" containerID="44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.600436 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.627874 4816 scope.go:117] 
"RemoveContainer" containerID="d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.647104 4816 scope.go:117] "RemoveContainer" containerID="44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.647742 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15\": container with ID starting with 44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15 not found: ID does not exist" containerID="44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.647777 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15"} err="failed to get container status \"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15\": rpc error: code = NotFound desc = could not find container \"44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15\": container with ID starting with 44fdfe2aaa7bb00189c2e1708c4de4cb552c7330addf05e8997e655317268e15 not found: ID does not exist" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.647803 4816 scope.go:117] "RemoveContainer" containerID="d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" Mar 11 12:20:07 crc kubenswrapper[4816]: E0311 12:20:07.648837 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b\": container with ID starting with d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b not found: ID does not exist" containerID="d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b" Mar 11 12:20:07 crc 
kubenswrapper[4816]: I0311 12:20:07.648868 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b"} err="failed to get container status \"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b\": rpc error: code = NotFound desc = could not find container \"d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b\": container with ID starting with d9c605fc0632b2e2c60468a879a920939f818109ba64880a57f9f9b475f1614b not found: ID does not exist" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684402 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684510 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684540 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684603 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684623 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9hs\" (UniqueName: \"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684647 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684664 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.684695 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790638 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790791 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9hs\" (UniqueName: \"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790826 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790848 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790881 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790920 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.790971 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.793512 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.793819 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.794016 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.800347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.812587 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.813442 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.831082 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.834394 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9hs\" (UniqueName: \"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" 
Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.840306 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.849851 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.854622 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.864434 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.868994 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.874891 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.875348 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.875768 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.885887 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.997794 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.997871 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.997924 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.997988 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.998010 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") pod 
\"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.998052 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.998340 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:07 crc kubenswrapper[4816]: I0311 12:20:07.998403 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100713 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100851 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100892 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100927 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.100957 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.102225 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.102913 4816 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.104590 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.118845 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc 
kubenswrapper[4816]: I0311 12:20:08.119003 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.119947 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.120231 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.128268 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") pod \"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.151842 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439b686e-927d-425a-a218-807220ae1e95" path="/var/lib/kubelet/pods/439b686e-927d-425a-a218-807220ae1e95/volumes" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.152559 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.155549 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d3606c-b28d-4028-93fc-535afa127cd6" path="/var/lib/kubelet/pods/a9d3606c-b28d-4028-93fc-535afa127cd6/volumes" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.324960 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546522 4816 generic.go:334] "Generic (PLEG): container finished" podID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerID="2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01" exitCode=0 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546557 4816 generic.go:334] "Generic (PLEG): container finished" podID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerID="caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd" exitCode=2 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546564 4816 generic.go:334] "Generic (PLEG): container finished" podID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerID="09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641" exitCode=0 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546573 4816 generic.go:334] "Generic (PLEG): container finished" podID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerID="29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f" exitCode=0 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546615 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01"} Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546649 4816 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd"} Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641"} Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.546670 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f"} Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.736058 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:20:08 crc kubenswrapper[4816]: W0311 12:20:08.741415 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7457f2db_7979_4d92_bd90_a1464b8a3878.slice/crio-722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635 WatchSource:0}: Error finding container 722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635: Status 404 returned error can't find the container with id 722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635 Mar 11 12:20:08 crc kubenswrapper[4816]: I0311 12:20:08.902371 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.044739 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.044839 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.044900 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045035 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045091 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045119 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") pod \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\" (UID: \"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49\") " Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.045732 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.046559 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.052641 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.074917 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts" (OuterVolumeSpecName: "scripts") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.098060 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.137501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb" (OuterVolumeSpecName: "kube-api-access-wc2gb") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "kube-api-access-wc2gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.148975 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.149031 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.149046 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.149057 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc2gb\" (UniqueName: \"kubernetes.io/projected/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-kube-api-access-wc2gb\") on node 
\"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.170914 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.179184 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: W0311 12:20:09.183505 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode95ddca0_76d0_4dce_9983_4b07655adc25.slice/crio-f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de WatchSource:0}: Error finding container f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de: Status 404 returned error can't find the container with id f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.240501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data" (OuterVolumeSpecName: "config-data") pod "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" (UID: "ebe67fba-9b24-4bdf-bcb9-d06e979d1e49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.251617 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.251825 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.581465 4816 generic.go:334] "Generic (PLEG): container finished" podID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" containerID="7074db26ba14c2f5793b32a499e15ff64a76fc4764f04041e3b7367e813d1eb6" exitCode=0 Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.581871 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" event={"ID":"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9","Type":"ContainerDied","Data":"7074db26ba14c2f5793b32a499e15ff64a76fc4764f04041e3b7367e813d1eb6"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.585574 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerStarted","Data":"c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.585604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerStarted","Data":"722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.593509 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ebe67fba-9b24-4bdf-bcb9-d06e979d1e49","Type":"ContainerDied","Data":"bab09cd01583eebdccfb229a37532b6f0674000f5d1606f07d42c8adaa348948"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.593559 4816 scope.go:117] "RemoveContainer" containerID="2482598d21f5c9ee2cffe3291bcb032276779844be127265c434d4ee3a10dd01" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.593679 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.610164 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerStarted","Data":"f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de"} Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.640214 4816 scope.go:117] "RemoveContainer" containerID="caf5f5963ed5f620c8712cea969e0fcf607060ab3626bd9f71ed7c1f2fef14cd" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.641978 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.652986 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672200 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: E0311 12:20:09.672774 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-central-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672798 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-central-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: E0311 12:20:09.672819 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-notification-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672829 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-notification-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: E0311 12:20:09.672848 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="proxy-httpd" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672857 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="proxy-httpd" Mar 11 12:20:09 crc kubenswrapper[4816]: E0311 12:20:09.672877 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="sg-core" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.672883 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="sg-core" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.673058 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="sg-core" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.673081 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-notification-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.673092 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="proxy-httpd" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.673104 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" containerName="ceilometer-central-agent" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.675486 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.680497 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.680520 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.698673 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766318 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766441 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766550 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766610 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " 
pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766665 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766724 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766765 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.766957 4816 scope.go:117] "RemoveContainer" containerID="09ea84d41bea9be23219e7a701b78afcc81f9e1c777303de3089f54128f5a641" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.834552 4816 scope.go:117] "RemoveContainer" containerID="29fe9c7ac5f65d3d19f417a0d611bc3a79cf763f7ef21444af5a797e56f3f63f" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874070 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874145 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874188 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874213 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874238 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874290 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.874311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " 
pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.875745 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.876024 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.883478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.884813 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.887381 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.889386 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") pod 
\"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:09 crc kubenswrapper[4816]: I0311 12:20:09.896314 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") pod \"ceilometer-0\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " pod="openstack/ceilometer-0" Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.042367 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.141485 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe67fba-9b24-4bdf-bcb9-d06e979d1e49" path="/var/lib/kubelet/pods/ebe67fba-9b24-4bdf-bcb9-d06e979d1e49/volumes" Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.599263 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:10 crc kubenswrapper[4816]: W0311 12:20:10.606090 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6387790e_663e_4746_9e9f_250ac4a06535.slice/crio-6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79 WatchSource:0}: Error finding container 6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79: Status 404 returned error can't find the container with id 6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79 Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.661431 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerStarted","Data":"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764"} Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.661486 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerStarted","Data":"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25"} Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.667652 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79"} Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.671587 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerStarted","Data":"8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749"} Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.708082 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.708055894 podStartE2EDuration="3.708055894s" podCreationTimestamp="2026-03-11 12:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:10.684265575 +0000 UTC m=+1297.275529542" watchObservedRunningTime="2026-03-11 12:20:10.708055894 +0000 UTC m=+1297.299319861" Mar 11 12:20:10 crc kubenswrapper[4816]: I0311 12:20:10.719698 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.719677126 podStartE2EDuration="3.719677126s" podCreationTimestamp="2026-03-11 12:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:10.71457042 +0000 UTC m=+1297.305834397" watchObservedRunningTime="2026-03-11 12:20:10.719677126 +0000 UTC m=+1297.310941093" Mar 11 12:20:11 crc kubenswrapper[4816]: 
I0311 12:20:11.069504 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.205473 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") pod \"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\" (UID: \"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9\") " Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.223516 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf" (OuterVolumeSpecName: "kube-api-access-wdcqf") pod "4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" (UID: "4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9"). InnerVolumeSpecName "kube-api-access-wdcqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.307712 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdcqf\" (UniqueName: \"kubernetes.io/projected/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9-kube-api-access-wdcqf\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.684827 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45"} Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.688400 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.688412 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553860-9kp4n" event={"ID":"4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9","Type":"ContainerDied","Data":"945268d86ba621c8fa8980dff5e43070ffe204d163fdf1c51439bce4ca2b4338"} Mar 11 12:20:11 crc kubenswrapper[4816]: I0311 12:20:11.688520 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="945268d86ba621c8fa8980dff5e43070ffe204d163fdf1c51439bce4ca2b4338" Mar 11 12:20:12 crc kubenswrapper[4816]: I0311 12:20:12.161892 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:20:12 crc kubenswrapper[4816]: I0311 12:20:12.173815 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553854-hbf96"] Mar 11 12:20:12 crc kubenswrapper[4816]: I0311 12:20:12.539213 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:12 crc kubenswrapper[4816]: I0311 12:20:12.700488 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6"} Mar 11 12:20:13 crc kubenswrapper[4816]: I0311 12:20:13.714701 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef"} Mar 11 12:20:14 crc kubenswrapper[4816]: I0311 12:20:14.147305 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8a107b-6295-42d4-b64b-7841171f67f3" path="/var/lib/kubelet/pods/af8a107b-6295-42d4-b64b-7841171f67f3/volumes" Mar 11 12:20:15 crc 
kubenswrapper[4816]: I0311 12:20:15.745566 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerStarted","Data":"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792"} Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.746514 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.745909 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-central-agent" containerID="cri-o://a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" gracePeriod=30 Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.746143 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="proxy-httpd" containerID="cri-o://50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" gracePeriod=30 Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.746119 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="sg-core" containerID="cri-o://9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" gracePeriod=30 Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.746158 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-notification-agent" containerID="cri-o://1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" gracePeriod=30 Mar 11 12:20:15 crc kubenswrapper[4816]: I0311 12:20:15.786083 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.280106551 podStartE2EDuration="6.786057453s" podCreationTimestamp="2026-03-11 12:20:09 +0000 UTC" firstStartedPulling="2026-03-11 12:20:10.608432318 +0000 UTC m=+1297.199696285" lastFinishedPulling="2026-03-11 12:20:15.11438321 +0000 UTC m=+1301.705647187" observedRunningTime="2026-03-11 12:20:15.775418739 +0000 UTC m=+1302.366682706" watchObservedRunningTime="2026-03-11 12:20:15.786057453 +0000 UTC m=+1302.377321420" Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.759919 4816 generic.go:334] "Generic (PLEG): container finished" podID="6387790e-663e-4746-9e9f-250ac4a06535" containerID="50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" exitCode=0 Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.759970 4816 generic.go:334] "Generic (PLEG): container finished" podID="6387790e-663e-4746-9e9f-250ac4a06535" containerID="9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" exitCode=2 Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.759981 4816 generic.go:334] "Generic (PLEG): container finished" podID="6387790e-663e-4746-9e9f-250ac4a06535" containerID="1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" exitCode=0 Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.760008 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792"} Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.760067 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef"} Mar 11 12:20:16 crc kubenswrapper[4816]: I0311 12:20:16.760084 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6"} Mar 11 12:20:17 crc kubenswrapper[4816]: I0311 12:20:17.887523 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 12:20:17 crc kubenswrapper[4816]: I0311 12:20:17.888073 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 12:20:17 crc kubenswrapper[4816]: I0311 12:20:17.931957 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 12:20:17 crc kubenswrapper[4816]: I0311 12:20:17.939343 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.327178 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.327317 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.362169 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.400777 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.665601 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.785577 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.785760 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.786963 4816 generic.go:334] "Generic (PLEG): container finished" podID="6268fe92-5c93-43c7-95bc-f30befda5d65" containerID="f424c56cb69a088a064cc5d2e599b7db758b5e10ecf876c1586a8816a4d96acd" exitCode=0 Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.786988 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787078 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" event={"ID":"6268fe92-5c93-43c7-95bc-f30befda5d65","Type":"ContainerDied","Data":"f424c56cb69a088a064cc5d2e599b7db758b5e10ecf876c1586a8816a4d96acd"} Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787126 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: 
\"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787223 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787317 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787358 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") pod \"6387790e-663e-4746-9e9f-250ac4a06535\" (UID: \"6387790e-663e-4746-9e9f-250ac4a06535\") " Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787821 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.787869 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.788151 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.788172 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387790e-663e-4746-9e9f-250ac4a06535-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.794561 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms" (OuterVolumeSpecName: "kube-api-access-tgmms") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "kube-api-access-tgmms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.795365 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts" (OuterVolumeSpecName: "scripts") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.810422 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.810491 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45"} Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.810607 4816 scope.go:117] "RemoveContainer" containerID="50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.810207 4816 generic.go:334] "Generic (PLEG): container finished" podID="6387790e-663e-4746-9e9f-250ac4a06535" containerID="a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" exitCode=0 Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.813446 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387790e-663e-4746-9e9f-250ac4a06535","Type":"ContainerDied","Data":"6118c1f938ca09df069e1d25ab77da2d96c4cf88ac9d852f756ce969121a9a79"} Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.815628 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.815670 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.815684 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.815693 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.832734 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.890810 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.891025 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.891043 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgmms\" (UniqueName: \"kubernetes.io/projected/6387790e-663e-4746-9e9f-250ac4a06535-kube-api-access-tgmms\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.903632 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data" (OuterVolumeSpecName: "config-data") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.910854 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6387790e-663e-4746-9e9f-250ac4a06535" (UID: "6387790e-663e-4746-9e9f-250ac4a06535"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.939075 4816 scope.go:117] "RemoveContainer" containerID="9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.976690 4816 scope.go:117] "RemoveContainer" containerID="1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.993758 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:18 crc kubenswrapper[4816]: I0311 12:20:18.993794 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387790e-663e-4746-9e9f-250ac4a06535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.011430 4816 scope.go:117] "RemoveContainer" containerID="a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.053468 4816 scope.go:117] "RemoveContainer" containerID="50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.054748 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792\": container with ID starting with 50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792 not found: ID does not exist" containerID="50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.054811 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792"} 
err="failed to get container status \"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792\": rpc error: code = NotFound desc = could not find container \"50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792\": container with ID starting with 50710f72c7fd036d5cf3077365b496702e4d218124f7e7515d8a364a715ce792 not found: ID does not exist" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.054850 4816 scope.go:117] "RemoveContainer" containerID="9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.055625 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef\": container with ID starting with 9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef not found: ID does not exist" containerID="9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.055652 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef"} err="failed to get container status \"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef\": rpc error: code = NotFound desc = could not find container \"9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef\": container with ID starting with 9ad1844ff4b6a10e2e7d0ff4017d82a630d3d5caeee3ee1c608a37d0e8df81ef not found: ID does not exist" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.055668 4816 scope.go:117] "RemoveContainer" containerID="1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.056445 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6\": container with ID starting with 1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6 not found: ID does not exist" containerID="1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.056494 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6"} err="failed to get container status \"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6\": rpc error: code = NotFound desc = could not find container \"1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6\": container with ID starting with 1e66c2af1e1843e1cce30bde9523ec3123ba04a31ae667393c01f3ef14fbcdb6 not found: ID does not exist" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.056513 4816 scope.go:117] "RemoveContainer" containerID="a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.057065 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45\": container with ID starting with a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45 not found: ID does not exist" containerID="a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.057134 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45"} err="failed to get container status \"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45\": rpc error: code = NotFound desc = could not find container \"a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45\": container with ID 
starting with a566453a6ed7cd3c331201e19eeae628b7b7c5cb4a8edc9454509c8a7f44ce45 not found: ID does not exist" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.164138 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.175111 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.211551 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212046 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-notification-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212071 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-notification-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212101 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="proxy-httpd" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212109 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="proxy-httpd" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212123 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" containerName="oc" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212131 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" containerName="oc" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212142 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="sg-core" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212149 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="sg-core" Mar 11 12:20:19 crc kubenswrapper[4816]: E0311 12:20:19.212166 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-central-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212174 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-central-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212377 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="proxy-httpd" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212392 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" containerName="oc" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212411 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-central-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212423 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="ceilometer-notification-agent" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.212431 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387790e-663e-4746-9e9f-250ac4a06535" containerName="sg-core" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.214498 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.218817 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.219068 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.231209 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404681 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404789 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8dwr\" (UniqueName: \"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " 
pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404879 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.404922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.405313 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507564 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507590 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d8dwr\" (UniqueName: \"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507615 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507645 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507695 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.507758 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.509003 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc 
kubenswrapper[4816]: I0311 12:20:19.509020 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.513094 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.512411 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.513801 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.514568 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.530526 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8dwr\" (UniqueName: \"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") pod \"ceilometer-0\" (UID: 
\"1a9b124c-68d8-44e9-9381-fa448155ef23\") " pod="openstack/ceilometer-0" Mar 11 12:20:19 crc kubenswrapper[4816]: I0311 12:20:19.554926 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.108116 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.145911 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6387790e-663e-4746-9e9f-250ac4a06535" path="/var/lib/kubelet/pods/6387790e-663e-4746-9e9f-250ac4a06535/volumes" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.175459 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.328720 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") pod \"6268fe92-5c93-43c7-95bc-f30befda5d65\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.328931 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") pod \"6268fe92-5c93-43c7-95bc-f30befda5d65\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.329003 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") pod \"6268fe92-5c93-43c7-95bc-f30befda5d65\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.329045 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") pod \"6268fe92-5c93-43c7-95bc-f30befda5d65\" (UID: \"6268fe92-5c93-43c7-95bc-f30befda5d65\") " Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.336186 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts" (OuterVolumeSpecName: "scripts") pod "6268fe92-5c93-43c7-95bc-f30befda5d65" (UID: "6268fe92-5c93-43c7-95bc-f30befda5d65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.341402 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w" (OuterVolumeSpecName: "kube-api-access-zpt2w") pod "6268fe92-5c93-43c7-95bc-f30befda5d65" (UID: "6268fe92-5c93-43c7-95bc-f30befda5d65"). InnerVolumeSpecName "kube-api-access-zpt2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.361032 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6268fe92-5c93-43c7-95bc-f30befda5d65" (UID: "6268fe92-5c93-43c7-95bc-f30befda5d65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.376813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data" (OuterVolumeSpecName: "config-data") pod "6268fe92-5c93-43c7-95bc-f30befda5d65" (UID: "6268fe92-5c93-43c7-95bc-f30befda5d65"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.433490 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.433535 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpt2w\" (UniqueName: \"kubernetes.io/projected/6268fe92-5c93-43c7-95bc-f30befda5d65-kube-api-access-zpt2w\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.433550 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.433561 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6268fe92-5c93-43c7-95bc-f30befda5d65-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.860042 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"38b868cc185bf2881b9763f9f27568b608cb3091bc38e885e64b2566d5c8d41e"} Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.867312 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.867344 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.868530 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.874809 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r2t5s" event={"ID":"6268fe92-5c93-43c7-95bc-f30befda5d65","Type":"ContainerDied","Data":"1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e"} Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.874875 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1939b4a789439a60faf8174002db0eb6692620810e7f2821d03a1a8fa9509b1e" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.946056 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.951392 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.986063 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.986498 4816 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 12:20:20 crc kubenswrapper[4816]: I0311 12:20:20.991008 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.149409 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:20:21 crc kubenswrapper[4816]: E0311 12:20:21.150478 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6268fe92-5c93-43c7-95bc-f30befda5d65" containerName="nova-cell0-conductor-db-sync" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.150496 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6268fe92-5c93-43c7-95bc-f30befda5d65" containerName="nova-cell0-conductor-db-sync" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.150695 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6268fe92-5c93-43c7-95bc-f30befda5d65" containerName="nova-cell0-conductor-db-sync" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.151469 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.154097 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.158608 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f2q4d" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.161376 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.254310 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.254584 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.254713 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.357157 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.357332 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.357371 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.363422 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.365857 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.384898 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") pod \"nova-cell0-conductor-0\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.475028 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.886944 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74"} Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.887516 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b"} Mar 11 12:20:21 crc kubenswrapper[4816]: W0311 12:20:21.986693 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9eb0dee_5bdb_4ca4_a746_d33e8b7d20cc.slice/crio-897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843 WatchSource:0}: Error finding container 897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843: Status 404 returned error can't find the container with id 897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843 Mar 11 12:20:21 crc kubenswrapper[4816]: I0311 12:20:21.991419 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:20:22 crc kubenswrapper[4816]: I0311 
12:20:22.895327 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc","Type":"ContainerStarted","Data":"4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2"} Mar 11 12:20:22 crc kubenswrapper[4816]: I0311 12:20:22.896291 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc","Type":"ContainerStarted","Data":"897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843"} Mar 11 12:20:22 crc kubenswrapper[4816]: I0311 12:20:22.922789 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.922763106 podStartE2EDuration="1.922763106s" podCreationTimestamp="2026-03-11 12:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:22.918664619 +0000 UTC m=+1309.509928586" watchObservedRunningTime="2026-03-11 12:20:22.922763106 +0000 UTC m=+1309.514027073" Mar 11 12:20:23 crc kubenswrapper[4816]: I0311 12:20:23.910721 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818"} Mar 11 12:20:23 crc kubenswrapper[4816]: I0311 12:20:23.911188 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:25 crc kubenswrapper[4816]: I0311 12:20:25.936197 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerStarted","Data":"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c"} Mar 11 12:20:25 crc kubenswrapper[4816]: I0311 12:20:25.939094 4816 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:20:25 crc kubenswrapper[4816]: I0311 12:20:25.973562 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.962653178 podStartE2EDuration="6.973539039s" podCreationTimestamp="2026-03-11 12:20:19 +0000 UTC" firstStartedPulling="2026-03-11 12:20:20.118634381 +0000 UTC m=+1306.709898348" lastFinishedPulling="2026-03-11 12:20:25.129520242 +0000 UTC m=+1311.720784209" observedRunningTime="2026-03-11 12:20:25.97045063 +0000 UTC m=+1312.561714597" watchObservedRunningTime="2026-03-11 12:20:25.973539039 +0000 UTC m=+1312.564803006" Mar 11 12:20:31 crc kubenswrapper[4816]: I0311 12:20:31.517937 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.086341 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.088517 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.091703 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.091931 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.101000 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.134660 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.134752 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.134785 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.135017 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.239594 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.239865 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.239943 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.239973 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.249558 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.260093 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.297899 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.438619 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.440821 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.467028 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.469768 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.479459 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.479569 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549177 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549779 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549811 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549831 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 
12:20:32.549878 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549895 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.549910 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.611306 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.626184 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.626364 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.631437 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650606 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650668 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650692 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650765 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") pod \"nova-api-0\" (UID: 
\"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650789 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650805 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650836 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650869 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650915 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.650938 4816 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.655040 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.668087 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.668130 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.668451 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.668565 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.687429 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.713193 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.752425 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.752544 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.752591 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.752613 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.754990 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.764509 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.773283 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.797142 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") pod \"nova-api-0\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") " pod="openstack/nova-api-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.797895 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") pod \"nova-cell1-novncproxy-0\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.809888 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.820979 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") pod \"nova-cell0-cell-mapping-qt9tz\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.827359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") pod \"nova-metadata-0\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") " pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.905316 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.906940 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.910231 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.915887 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.937440 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.939135 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.953819 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:32 crc kubenswrapper[4816]: I0311 12:20:32.980397 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.013021 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065262 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065322 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065385 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065467 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " 
pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065694 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065835 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.065922 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.066011 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.066128 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " 
pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.091675 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168650 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168743 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168795 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168847 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168887 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: 
\"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168915 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.168973 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.169061 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.169108 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.170589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " 
pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.170662 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.171105 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.171660 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.172028 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.174417 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.174471 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.189835 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") pod \"nova-scheduler-0\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") " pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.190670 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") pod \"dnsmasq-dns-69b4446475-bsnbn\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.257861 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.276656 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.562651 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.567388 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.576131 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.576414 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.606108 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.687113 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.687931 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.687987 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.688281 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.790804 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.792234 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.792311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.792629 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.805488 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.805939 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.809325 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.831769 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") pod \"nova-cell1-conductor-db-sync-wdblc\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:33 crc kubenswrapper[4816]: I0311 12:20:33.984913 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.026176 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.126214 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:34 crc kubenswrapper[4816]: W0311 12:20:34.276576 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05937283_8ec7_430d_be71_c968e8e97ff1.slice/crio-20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8 WatchSource:0}: Error finding container 20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8: Status 404 returned error can't find the container with id 20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8 Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.293227 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.349160 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"] Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.380806 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.392278 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:20:34 crc kubenswrapper[4816]: W0311 12:20:34.422152 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1370549e_42a3_450d_a28d_47d4a0764f56.slice/crio-31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d WatchSource:0}: Error finding container 31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d: Status 404 returned error 
can't find the container with id 31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d Mar 11 12:20:34 crc kubenswrapper[4816]: I0311 12:20:34.647363 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.054823 4816 generic.go:334] "Generic (PLEG): container finished" podID="1370549e-42a3-450d-a28d-47d4a0764f56" containerID="73c8dd7cd36356a8399521ca85923fa5bea70d1a67253cbf2ad9c716aae771dd" exitCode=0 Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.054955 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerDied","Data":"73c8dd7cd36356a8399521ca85923fa5bea70d1a67253cbf2ad9c716aae771dd"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.055239 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerStarted","Data":"31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.061641 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdblc" event={"ID":"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac","Type":"ContainerStarted","Data":"393812c2bcecafdfa2f0e8dd848197ee6392877cb2e73b5a2b5b12d642a2ed5c"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.061693 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdblc" event={"ID":"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac","Type":"ContainerStarted","Data":"f816942d05a050c01aa9ba2c41f5b875b2c28a8b75fac93c9319285524d0649e"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.066150 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerStarted","Data":"8b2de9f5a79740ef68c68b7f57f77e1e4a7c7e1b2cf9e2d47a43242ce1a5d655"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.071806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qt9tz" event={"ID":"3f519dc2-e88b-4e4b-9637-c3e172b81bfa","Type":"ContainerStarted","Data":"7499f2a9acd657b210b2f77e2cefe97fa749ba96e868296d76960eaa9ed38ee8"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.071846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qt9tz" event={"ID":"3f519dc2-e88b-4e4b-9637-c3e172b81bfa","Type":"ContainerStarted","Data":"f5b3dcc0252b5364b53dcccbdc300f4832e72947bf82b5958b6e65ae5b0eac60"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.086021 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerStarted","Data":"0ff4ead2a33e6228002ecd5e8665db969fa8aef2732966d20f7187976e9cf4b6"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.088600 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05937283-8ec7-430d-be71-c968e8e97ff1","Type":"ContainerStarted","Data":"20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.092469 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33811121-46de-4941-bd74-18ecaa2c2827","Type":"ContainerStarted","Data":"cc7188d0b18641404663ed171ec3812667a2d4778de79e666b89e8d42f9ec1e9"} Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.124216 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qt9tz" podStartSLOduration=3.124183157 podStartE2EDuration="3.124183157s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:35.103752473 +0000 UTC m=+1321.695016450" watchObservedRunningTime="2026-03-11 12:20:35.124183157 +0000 UTC m=+1321.715447124" Mar 11 12:20:35 crc kubenswrapper[4816]: I0311 12:20:35.141433 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wdblc" podStartSLOduration=2.141407599 podStartE2EDuration="2.141407599s" podCreationTimestamp="2026-03-11 12:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:35.132132094 +0000 UTC m=+1321.723396071" watchObservedRunningTime="2026-03-11 12:20:35.141407599 +0000 UTC m=+1321.732671566" Mar 11 12:20:36 crc kubenswrapper[4816]: I0311 12:20:36.984467 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:36 crc kubenswrapper[4816]: I0311 12:20:36.995500 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.181826 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerStarted","Data":"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"} Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.195288 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerStarted","Data":"03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907"} Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.200345 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"05937283-8ec7-430d-be71-c968e8e97ff1","Type":"ContainerStarted","Data":"e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42"}
Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.205460 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="33811121-46de-4941-bd74-18ecaa2c2827" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" gracePeriod=30
Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.205543 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33811121-46de-4941-bd74-18ecaa2c2827","Type":"ContainerStarted","Data":"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0"}
Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.210294 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerStarted","Data":"c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6"}
Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.210565 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69b4446475-bsnbn"
Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.228988 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.896405128 podStartE2EDuration="6.228968563s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="2026-03-11 12:20:34.280304074 +0000 UTC m=+1320.871568041" lastFinishedPulling="2026-03-11 12:20:37.612867499 +0000 UTC m=+1324.204131476" observedRunningTime="2026-03-11 12:20:38.22431514 +0000 UTC m=+1324.815579107" watchObservedRunningTime="2026-03-11 12:20:38.228968563 +0000 UTC m=+1324.820232530"
Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.258192 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.260446 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" podStartSLOduration=6.260422541 podStartE2EDuration="6.260422541s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:38.249951982 +0000 UTC m=+1324.841215949" watchObservedRunningTime="2026-03-11 12:20:38.260422541 +0000 UTC m=+1324.851686508"
Mar 11 12:20:38 crc kubenswrapper[4816]: I0311 12:20:38.282481 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.710712743 podStartE2EDuration="6.282451311s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="2026-03-11 12:20:34.040711499 +0000 UTC m=+1320.631975466" lastFinishedPulling="2026-03-11 12:20:37.612450057 +0000 UTC m=+1324.203714034" observedRunningTime="2026-03-11 12:20:38.274524464 +0000 UTC m=+1324.865788431" watchObservedRunningTime="2026-03-11 12:20:38.282451311 +0000 UTC m=+1324.873715278"
Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.224185 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerStarted","Data":"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"}
Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.224350 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-log" containerID="cri-o://fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af" gracePeriod=30
Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.224455 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-metadata" containerID="cri-o://23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e" gracePeriod=30
Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.227798 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerStarted","Data":"40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a"}
Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.266543 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.819882126 podStartE2EDuration="7.26651646s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="2026-03-11 12:20:34.165100813 +0000 UTC m=+1320.756364780" lastFinishedPulling="2026-03-11 12:20:37.611735157 +0000 UTC m=+1324.202999114" observedRunningTime="2026-03-11 12:20:39.255949728 +0000 UTC m=+1325.847213695" watchObservedRunningTime="2026-03-11 12:20:39.26651646 +0000 UTC m=+1325.857780427"
Mar 11 12:20:39 crc kubenswrapper[4816]: I0311 12:20:39.308116 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.101696878 podStartE2EDuration="7.308091478s" podCreationTimestamp="2026-03-11 12:20:32 +0000 UTC" firstStartedPulling="2026-03-11 12:20:34.405351087 +0000 UTC m=+1320.996615054" lastFinishedPulling="2026-03-11 12:20:37.611745687 +0000 UTC m=+1324.203009654" observedRunningTime="2026-03-11 12:20:39.300432139 +0000 UTC m=+1325.891696106" watchObservedRunningTime="2026-03-11 12:20:39.308091478 +0000 UTC m=+1325.899355445"
Mar 11 12:20:40 crc kubenswrapper[4816]: I0311 12:20:40.239489 4816 generic.go:334] "Generic (PLEG): container finished" podID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerID="fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af" exitCode=143
Mar 11 12:20:40 crc kubenswrapper[4816]: I0311 12:20:40.239569 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerDied","Data":"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"}
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.177721 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.253017 4816 generic.go:334] "Generic (PLEG): container finished" podID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerID="23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e" exitCode=0
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.253076 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerDied","Data":"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"}
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.253112 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1966e6cd-d10e-468d-9e4f-7484f67202b4","Type":"ContainerDied","Data":"8b2de9f5a79740ef68c68b7f57f77e1e4a7c7e1b2cf9e2d47a43242ce1a5d655"}
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.253133 4816 scope.go:117] "RemoveContainer" containerID="23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.254193 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.291404 4816 scope.go:117] "RemoveContainer" containerID="fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.317104 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") pod \"1966e6cd-d10e-468d-9e4f-7484f67202b4\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") "
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.317344 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") pod \"1966e6cd-d10e-468d-9e4f-7484f67202b4\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") "
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.317453 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") pod \"1966e6cd-d10e-468d-9e4f-7484f67202b4\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") "
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.317510 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") pod \"1966e6cd-d10e-468d-9e4f-7484f67202b4\" (UID: \"1966e6cd-d10e-468d-9e4f-7484f67202b4\") "
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.318187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs" (OuterVolumeSpecName: "logs") pod "1966e6cd-d10e-468d-9e4f-7484f67202b4" (UID: "1966e6cd-d10e-468d-9e4f-7484f67202b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.318413 4816 scope.go:117] "RemoveContainer" containerID="23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"
Mar 11 12:20:41 crc kubenswrapper[4816]: E0311 12:20:41.319182 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e\": container with ID starting with 23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e not found: ID does not exist" containerID="23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.319223 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e"} err="failed to get container status \"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e\": rpc error: code = NotFound desc = could not find container \"23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e\": container with ID starting with 23478b821003ec3bd794fa4b8451ec55b780ab8eff2e47fa7b900d5f00b65e7e not found: ID does not exist"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.319262 4816 scope.go:117] "RemoveContainer" containerID="fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"
Mar 11 12:20:41 crc kubenswrapper[4816]: E0311 12:20:41.319721 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af\": container with ID starting with fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af not found: ID does not exist" containerID="fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.319788 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af"} err="failed to get container status \"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af\": rpc error: code = NotFound desc = could not find container \"fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af\": container with ID starting with fadab9a3a0d55f6711af30c592b5f8e3f3642fcbae90f79d83b3428ffab019af not found: ID does not exist"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.333401 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn" (OuterVolumeSpecName: "kube-api-access-nsrdn") pod "1966e6cd-d10e-468d-9e4f-7484f67202b4" (UID: "1966e6cd-d10e-468d-9e4f-7484f67202b4"). InnerVolumeSpecName "kube-api-access-nsrdn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.355974 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data" (OuterVolumeSpecName: "config-data") pod "1966e6cd-d10e-468d-9e4f-7484f67202b4" (UID: "1966e6cd-d10e-468d-9e4f-7484f67202b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.365972 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1966e6cd-d10e-468d-9e4f-7484f67202b4" (UID: "1966e6cd-d10e-468d-9e4f-7484f67202b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.421228 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1966e6cd-d10e-468d-9e4f-7484f67202b4-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.421294 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsrdn\" (UniqueName: \"kubernetes.io/projected/1966e6cd-d10e-468d-9e4f-7484f67202b4-kube-api-access-nsrdn\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.421308 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.421321 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1966e6cd-d10e-468d-9e4f-7484f67202b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.605873 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.624482 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.644917 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:41 crc kubenswrapper[4816]: E0311 12:20:41.645620 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-metadata"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.645656 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-metadata"
Mar 11 12:20:41 crc kubenswrapper[4816]: E0311 12:20:41.646182 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-log"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.646205 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-log"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.646529 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-log"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.646564 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" containerName="nova-metadata-metadata"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.648427 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.652828 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.653185 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.674663 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.727926 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.727988 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.728153 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.728197 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.728221 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830717 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830798 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830822 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830890 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.830910 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.831669 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.837538 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.842635 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.849229 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.852201 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") pod \"nova-metadata-0\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:41 crc kubenswrapper[4816]: I0311 12:20:41.983134 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:20:42 crc kubenswrapper[4816]: I0311 12:20:42.156404 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1966e6cd-d10e-468d-9e4f-7484f67202b4" path="/var/lib/kubelet/pods/1966e6cd-d10e-468d-9e4f-7484f67202b4/volumes"
Mar 11 12:20:42 crc kubenswrapper[4816]: I0311 12:20:42.534099 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:42 crc kubenswrapper[4816]: I0311 12:20:42.811061 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.091962 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.092065 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.258841 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.278491 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69b4446475-bsnbn"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.279680 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerStarted","Data":"9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618"}
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.279757 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerStarted","Data":"7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718"}
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.279777 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerStarted","Data":"9fb01dffc484a29c979e5dd44b3227a5cfb654c600c8c00be0f28ed629855af7"}
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.284298 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" containerID="393812c2bcecafdfa2f0e8dd848197ee6392877cb2e73b5a2b5b12d642a2ed5c" exitCode=0
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.284355 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdblc" event={"ID":"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac","Type":"ContainerDied","Data":"393812c2bcecafdfa2f0e8dd848197ee6392877cb2e73b5a2b5b12d642a2ed5c"}
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.293450 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.346912 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.367959 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.367933523 podStartE2EDuration="2.367933523s" podCreationTimestamp="2026-03-11 12:20:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:43.325302775 +0000 UTC m=+1329.916566742" watchObservedRunningTime="2026-03-11 12:20:43.367933523 +0000 UTC m=+1329.959197490"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.391411 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"]
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.391701 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="dnsmasq-dns" containerID="cri-o://2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2" gracePeriod=10
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.847119 4816 scope.go:117] "RemoveContainer" containerID="37568547e2b255f52263c2130857ff28c18773cdb28a0d8fb13178ff2dc5ab7f"
Mar 11 12:20:43 crc kubenswrapper[4816]: I0311 12:20:43.984263 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.098833 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") "
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.098901 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") "
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.099110 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") "
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.099190 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") "
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.099225 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") "
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.099341 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") pod \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\" (UID: \"1f7f295b-c30d-49a7-b5fa-b1ae8f705589\") "
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.110540 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q" (OuterVolumeSpecName: "kube-api-access-kwh8q") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "kube-api-access-kwh8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.139819 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.142141 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.184667 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.204776 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.204815 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwh8q\" (UniqueName: \"kubernetes.io/projected/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-kube-api-access-kwh8q\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.209996 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.210266 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.213665 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config" (OuterVolumeSpecName: "config") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.241007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f7f295b-c30d-49a7-b5fa-b1ae8f705589" (UID: "1f7f295b-c30d-49a7-b5fa-b1ae8f705589"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.307430 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.307477 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.307490 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.307501 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f7f295b-c30d-49a7-b5fa-b1ae8f705589-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309401 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerID="2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2" exitCode=0
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309472 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309540 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerDied","Data":"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"}
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309574 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-7gcck" event={"ID":"1f7f295b-c30d-49a7-b5fa-b1ae8f705589","Type":"ContainerDied","Data":"22b1daa75682bd6ac40d3753e3d1220fc2183f782012b1a65bc963f4cb8ba7ec"}
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.309598 4816 scope.go:117] "RemoveContainer" containerID="2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.355049 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"]
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.357269 4816 scope.go:117] "RemoveContainer" containerID="93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.364496 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-7gcck"]
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.399412 4816 scope.go:117] "RemoveContainer" containerID="2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"
Mar 11 12:20:44 crc kubenswrapper[4816]: E0311 12:20:44.400098 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2\": container with ID starting with 2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2 not found: ID does not exist" containerID="2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.400150 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2"} err="failed to get container status \"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2\": rpc error: code = NotFound desc = could not find container \"2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2\": container with ID starting with 2e2f32cf352c18f7e4bac10b432260956461e7a4bef8cc47289dc42ec91bc8c2 not found: ID does not exist"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.400187 4816 scope.go:117] "RemoveContainer" containerID="93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603"
Mar 11 12:20:44 crc kubenswrapper[4816]: E0311 12:20:44.400614 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603\": container with ID starting with 93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603 not found: ID does not exist" containerID="93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.400649 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603"} err="failed to get container status \"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603\": rpc error: code = NotFound desc = could not find container \"93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603\": container with ID starting with 93f7e9a29f416a66c2cb4ed0a6e544aeef7946ff21347d285940dc6a7bb96603 not found: ID does not exist"
Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.644342 4816 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.816909 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") pod \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.817537 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") pod \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.817619 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") pod \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.818445 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") pod \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\" (UID: \"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac\") " Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.824620 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc" (OuterVolumeSpecName: "kube-api-access-66txc") pod "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" (UID: "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac"). InnerVolumeSpecName "kube-api-access-66txc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.825518 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts" (OuterVolumeSpecName: "scripts") pod "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" (UID: "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.852217 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data" (OuterVolumeSpecName: "config-data") pod "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" (UID: "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.856931 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" (UID: "fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.920549 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.920583 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.920597 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66txc\" (UniqueName: \"kubernetes.io/projected/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-kube-api-access-66txc\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:44 crc kubenswrapper[4816]: I0311 12:20:44.920606 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.322113 4816 generic.go:334] "Generic (PLEG): container finished" podID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" containerID="7499f2a9acd657b210b2f77e2cefe97fa749ba96e868296d76960eaa9ed38ee8" exitCode=0 Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.322213 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qt9tz" event={"ID":"3f519dc2-e88b-4e4b-9637-c3e172b81bfa","Type":"ContainerDied","Data":"7499f2a9acd657b210b2f77e2cefe97fa749ba96e868296d76960eaa9ed38ee8"} Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.325688 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdblc" 
event={"ID":"fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac","Type":"ContainerDied","Data":"f816942d05a050c01aa9ba2c41f5b875b2c28a8b75fac93c9319285524d0649e"} Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.325748 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f816942d05a050c01aa9ba2c41f5b875b2c28a8b75fac93c9319285524d0649e" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.325790 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdblc" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.465490 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:20:45 crc kubenswrapper[4816]: E0311 12:20:45.466010 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="dnsmasq-dns" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466026 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="dnsmasq-dns" Mar 11 12:20:45 crc kubenswrapper[4816]: E0311 12:20:45.466046 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="init" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466052 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="init" Mar 11 12:20:45 crc kubenswrapper[4816]: E0311 12:20:45.466059 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" containerName="nova-cell1-conductor-db-sync" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466066 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" containerName="nova-cell1-conductor-db-sync" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466312 4816 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" containerName="dnsmasq-dns" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.466343 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" containerName="nova-cell1-conductor-db-sync" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.467472 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.470671 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.480489 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.636973 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.637087 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.637177 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcvt9\" (UniqueName: \"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " 
pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.740041 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.740106 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.740143 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcvt9\" (UniqueName: \"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.744688 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.758440 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.759918 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcvt9\" (UniqueName: \"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") pod \"nova-cell1-conductor-0\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:45 crc kubenswrapper[4816]: I0311 12:20:45.801998 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.146352 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7f295b-c30d-49a7-b5fa-b1ae8f705589" path="/var/lib/kubelet/pods/1f7f295b-c30d-49a7-b5fa-b1ae8f705589/volumes" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.294756 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:20:46 crc kubenswrapper[4816]: W0311 12:20:46.298569 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63567eba_cc2a_4168_9e81_51c1daed5482.slice/crio-188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7 WatchSource:0}: Error finding container 188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7: Status 404 returned error can't find the container with id 188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7 Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.336729 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"63567eba-cc2a-4168-9e81-51c1daed5482","Type":"ContainerStarted","Data":"188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7"} Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.746279 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.869694 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") pod \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.869761 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") pod \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.869815 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") pod \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.869991 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") pod \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\" (UID: \"3f519dc2-e88b-4e4b-9637-c3e172b81bfa\") " Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.875589 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr" (OuterVolumeSpecName: "kube-api-access-fs7hr") pod "3f519dc2-e88b-4e4b-9637-c3e172b81bfa" (UID: "3f519dc2-e88b-4e4b-9637-c3e172b81bfa"). InnerVolumeSpecName "kube-api-access-fs7hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.875881 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts" (OuterVolumeSpecName: "scripts") pod "3f519dc2-e88b-4e4b-9637-c3e172b81bfa" (UID: "3f519dc2-e88b-4e4b-9637-c3e172b81bfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.903949 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f519dc2-e88b-4e4b-9637-c3e172b81bfa" (UID: "3f519dc2-e88b-4e4b-9637-c3e172b81bfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.914110 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data" (OuterVolumeSpecName: "config-data") pod "3f519dc2-e88b-4e4b-9637-c3e172b81bfa" (UID: "3f519dc2-e88b-4e4b-9637-c3e172b81bfa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.971958 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.972001 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.972009 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.972022 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs7hr\" (UniqueName: \"kubernetes.io/projected/3f519dc2-e88b-4e4b-9637-c3e172b81bfa-kube-api-access-fs7hr\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.983862 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:20:46 crc kubenswrapper[4816]: I0311 12:20:46.983934 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.348060 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"63567eba-cc2a-4168-9e81-51c1daed5482","Type":"ContainerStarted","Data":"adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6"} Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.349860 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.352394 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qt9tz" event={"ID":"3f519dc2-e88b-4e4b-9637-c3e172b81bfa","Type":"ContainerDied","Data":"f5b3dcc0252b5364b53dcccbdc300f4832e72947bf82b5958b6e65ae5b0eac60"} Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.352443 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b3dcc0252b5364b53dcccbdc300f4832e72947bf82b5958b6e65ae5b0eac60" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.352510 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qt9tz" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.385524 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.385492961 podStartE2EDuration="2.385492961s" podCreationTimestamp="2026-03-11 12:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:47.372804218 +0000 UTC m=+1333.964068185" watchObservedRunningTime="2026-03-11 12:20:47.385492961 +0000 UTC m=+1333.976756928" Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.528108 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.528480 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log" containerID="cri-o://03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907" gracePeriod=30 Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.528627 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api" 
containerID="cri-o://40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a" gracePeriod=30 Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.538172 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.538486 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler" containerID="cri-o://e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" gracePeriod=30 Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.559102 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.559612 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-log" containerID="cri-o://7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718" gracePeriod=30 Mar 11 12:20:47 crc kubenswrapper[4816]: I0311 12:20:47.559728 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-metadata" containerID="cri-o://9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618" gracePeriod=30 Mar 11 12:20:48 crc kubenswrapper[4816]: E0311 12:20:48.261643 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:20:48 crc kubenswrapper[4816]: E0311 12:20:48.267163 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:20:48 crc kubenswrapper[4816]: E0311 12:20:48.268904 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:20:48 crc kubenswrapper[4816]: E0311 12:20:48.268948 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.380880 4816 generic.go:334] "Generic (PLEG): container finished" podID="421dad23-2283-4534-b064-250972bc1863" containerID="9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618" exitCode=0 Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.380935 4816 generic.go:334] "Generic (PLEG): container finished" podID="421dad23-2283-4534-b064-250972bc1863" containerID="7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718" exitCode=143 Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.381007 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerDied","Data":"9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618"} Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.381097 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerDied","Data":"7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718"} Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.384125 4816 generic.go:334] "Generic (PLEG): container finished" podID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerID="03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907" exitCode=143 Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.384214 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerDied","Data":"03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907"} Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.514414 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.607735 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.608380 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.608613 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.608830 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.609003 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") pod \"421dad23-2283-4534-b064-250972bc1863\" (UID: \"421dad23-2283-4534-b064-250972bc1863\") " Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.609048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs" (OuterVolumeSpecName: "logs") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.609858 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/421dad23-2283-4534-b064-250972bc1863-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.617019 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6" (OuterVolumeSpecName: "kube-api-access-29wd6") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "kube-api-access-29wd6". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.637523 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data" (OuterVolumeSpecName: "config-data") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.639059 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.676368 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "421dad23-2283-4534-b064-250972bc1863" (UID: "421dad23-2283-4534-b064-250972bc1863"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.711594 4816 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.711696 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29wd6\" (UniqueName: \"kubernetes.io/projected/421dad23-2283-4534-b064-250972bc1863-kube-api-access-29wd6\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.711710 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:48 crc kubenswrapper[4816]: I0311 12:20:48.711721 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421dad23-2283-4534-b064-250972bc1863-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.395837 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"421dad23-2283-4534-b064-250972bc1863","Type":"ContainerDied","Data":"9fb01dffc484a29c979e5dd44b3227a5cfb654c600c8c00be0f28ed629855af7"}
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.395905 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.395931 4816 scope.go:117] "RemoveContainer" containerID="9a07b967847536eab2a5c61594718aad5b432c59b70b5d223ca12c9d44afd618"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.431380 4816 scope.go:117] "RemoveContainer" containerID="7df3059467c64aeccef2f0dfb8f6972acb1a50b5456b5dc1259a2eb0aaddb718"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.443908 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.455705 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.465597 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:49 crc kubenswrapper[4816]: E0311 12:20:49.466667 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-log"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466687 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-log"
Mar 11 12:20:49 crc kubenswrapper[4816]: E0311 12:20:49.466702 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" containerName="nova-manage"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466710 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" containerName="nova-manage"
Mar 11 12:20:49 crc kubenswrapper[4816]: E0311 12:20:49.466728 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-metadata"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466736 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-metadata"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466951 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-metadata"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466972 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" containerName="nova-manage"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.466983 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="421dad23-2283-4534-b064-250972bc1863" containerName="nova-metadata-log"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.468335 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.471601 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.471878 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.485834 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.560728 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634133 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634363 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634408 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634433 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.634632 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737481 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737607 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737755 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737863 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.737915 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.738513 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.746642 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.747450 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.748185 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.762424 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") pod \"nova-metadata-0\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") " pod="openstack/nova-metadata-0"
Mar 11 12:20:49 crc kubenswrapper[4816]: I0311 12:20:49.798152 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.155376 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421dad23-2283-4534-b064-250972bc1863" path="/var/lib/kubelet/pods/421dad23-2283-4534-b064-250972bc1863/volumes"
Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.379750 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.441323 4816 generic.go:334] "Generic (PLEG): container finished" podID="05937283-8ec7-430d-be71-c968e8e97ff1" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42" exitCode=0
Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.442403 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05937283-8ec7-430d-be71-c968e8e97ff1","Type":"ContainerDied","Data":"e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42"}
Mar 11 12:20:50 crc kubenswrapper[4816]: I0311 12:20:50.895284 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.070369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") pod \"05937283-8ec7-430d-be71-c968e8e97ff1\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") "
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.070464 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") pod \"05937283-8ec7-430d-be71-c968e8e97ff1\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") "
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.070512 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") pod \"05937283-8ec7-430d-be71-c968e8e97ff1\" (UID: \"05937283-8ec7-430d-be71-c968e8e97ff1\") "
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.078131 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt" (OuterVolumeSpecName: "kube-api-access-7fdrt") pod "05937283-8ec7-430d-be71-c968e8e97ff1" (UID: "05937283-8ec7-430d-be71-c968e8e97ff1"). InnerVolumeSpecName "kube-api-access-7fdrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.118186 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05937283-8ec7-430d-be71-c968e8e97ff1" (UID: "05937283-8ec7-430d-be71-c968e8e97ff1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.125155 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data" (OuterVolumeSpecName: "config-data") pod "05937283-8ec7-430d-be71-c968e8e97ff1" (UID: "05937283-8ec7-430d-be71-c968e8e97ff1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.179374 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fdrt\" (UniqueName: \"kubernetes.io/projected/05937283-8ec7-430d-be71-c968e8e97ff1-kube-api-access-7fdrt\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.179431 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.179446 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05937283-8ec7-430d-be71-c968e8e97ff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.458393 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerStarted","Data":"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"}
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.458473 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerStarted","Data":"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"}
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.458495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerStarted","Data":"3d6f4a92fab1ae4820eecc5176239bbe544418957d5b8b49929c39dc6ee8800c"}
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.464925 4816 generic.go:334] "Generic (PLEG): container finished" podID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerID="40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a" exitCode=0
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.465024 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerDied","Data":"40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a"}
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.466499 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05937283-8ec7-430d-be71-c968e8e97ff1","Type":"ContainerDied","Data":"20c9b17fece99c2795c0705a485bb232956e3243ab02757479fd7900fae5c7a8"}
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.466546 4816 scope.go:117] "RemoveContainer" containerID="e9ad6464f8fd694a00b9feffc15da0986d481cd62391df808bf2e196c5e1ad42"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.466763 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.497297 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.509341 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.509308814 podStartE2EDuration="2.509308814s" podCreationTimestamp="2026-03-11 12:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:51.490405234 +0000 UTC m=+1338.081669201" watchObservedRunningTime="2026-03-11 12:20:51.509308814 +0000 UTC m=+1338.100572781"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.543126 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.552886 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.564272 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:20:51 crc kubenswrapper[4816]: E0311 12:20:51.564871 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.564888 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler"
Mar 11 12:20:51 crc kubenswrapper[4816]: E0311 12:20:51.564935 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.564941 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api"
Mar 11 12:20:51 crc kubenswrapper[4816]: E0311 12:20:51.564953 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.564959 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.565164 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-log"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.565188 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" containerName="nova-scheduler-scheduler"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.565207 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" containerName="nova-api-api"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.566035 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.568420 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.585059 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.593401 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") pod \"78b435e0-53bf-4f8c-aef9-49b170fc9519\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") "
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.593486 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") pod \"78b435e0-53bf-4f8c-aef9-49b170fc9519\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") "
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.593565 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") pod \"78b435e0-53bf-4f8c-aef9-49b170fc9519\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") "
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.593661 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") pod \"78b435e0-53bf-4f8c-aef9-49b170fc9519\" (UID: \"78b435e0-53bf-4f8c-aef9-49b170fc9519\") "
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.598749 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs" (OuterVolumeSpecName: "logs") pod "78b435e0-53bf-4f8c-aef9-49b170fc9519" (UID: "78b435e0-53bf-4f8c-aef9-49b170fc9519"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.599601 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd" (OuterVolumeSpecName: "kube-api-access-wlcbd") pod "78b435e0-53bf-4f8c-aef9-49b170fc9519" (UID: "78b435e0-53bf-4f8c-aef9-49b170fc9519"). InnerVolumeSpecName "kube-api-access-wlcbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.623813 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78b435e0-53bf-4f8c-aef9-49b170fc9519" (UID: "78b435e0-53bf-4f8c-aef9-49b170fc9519"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.623871 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data" (OuterVolumeSpecName: "config-data") pod "78b435e0-53bf-4f8c-aef9-49b170fc9519" (UID: "78b435e0-53bf-4f8c-aef9-49b170fc9519"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.699700 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700267 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700592 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b435e0-53bf-4f8c-aef9-49b170fc9519-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700618 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlcbd\" (UniqueName: \"kubernetes.io/projected/78b435e0-53bf-4f8c-aef9-49b170fc9519-kube-api-access-wlcbd\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700634 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.700648 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b435e0-53bf-4f8c-aef9-49b170fc9519-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.802552 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.803188 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.803221 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.808606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.809097 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.830817 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") pod \"nova-scheduler-0\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") " pod="openstack/nova-scheduler-0"
Mar 11 12:20:51 crc kubenswrapper[4816]: I0311 12:20:51.886540 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.145687 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05937283-8ec7-430d-be71-c968e8e97ff1" path="/var/lib/kubelet/pods/05937283-8ec7-430d-be71-c968e8e97ff1/volumes"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.364144 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.488475 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78b435e0-53bf-4f8c-aef9-49b170fc9519","Type":"ContainerDied","Data":"0ff4ead2a33e6228002ecd5e8665db969fa8aef2732966d20f7187976e9cf4b6"}
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.488555 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.488596 4816 scope.go:117] "RemoveContainer" containerID="40e1356e9811fbd25787534e33a830de5168253a4c4ac07e43758ab056b21a5a"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.498134 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49c3f447-334e-4147-b877-22a0ce6e3345","Type":"ContainerStarted","Data":"80c23e1f7785724059b85c847854192b3471a718a42ed80849445c1edfb1f7c4"}
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.532100 4816 scope.go:117] "RemoveContainer" containerID="03715e3af42eb6ce517cc3fbf23a4e260b58184bdafa92056416df870db5e907"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.532854 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.550009 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.555964 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.557845 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.561821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.585993 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.729555 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.729649 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.729674 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.729833 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.832031 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.832459 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.832803 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.832883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.833067 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.841138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.844962 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.855683 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") pod \"nova-api-0\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " pod="openstack/nova-api-0"
Mar 11 12:20:52 crc kubenswrapper[4816]: I0311 12:20:52.918445 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.420927 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.513108 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49c3f447-334e-4147-b877-22a0ce6e3345","Type":"ContainerStarted","Data":"4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d"}
Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.521095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerStarted","Data":"3161af8bf1555f75f7e3fe8c5b6c7028f30e608b5088c5375d09c6a61566d4c9"}
Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.542720 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.542692005 podStartE2EDuration="2.542692005s" podCreationTimestamp="2026-03-11 12:20:51 +0000 UTC" firstStartedPulling="0001-01-01
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:53.538684181 +0000 UTC m=+1340.129948148" watchObservedRunningTime="2026-03-11 12:20:53.542692005 +0000 UTC m=+1340.133955972" Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.835223 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:53 crc kubenswrapper[4816]: I0311 12:20:53.835993 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerName="kube-state-metrics" containerID="cri-o://679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" gracePeriod=30 Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.146566 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b435e0-53bf-4f8c-aef9-49b170fc9519" path="/var/lib/kubelet/pods/78b435e0-53bf-4f8c-aef9-49b170fc9519/volumes" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.345608 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.481133 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") pod \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\" (UID: \"8e9e4e8b-b60c-4c37-974a-8bdc1b243135\") " Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.488697 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg" (OuterVolumeSpecName: "kube-api-access-cbflg") pod "8e9e4e8b-b60c-4c37-974a-8bdc1b243135" (UID: "8e9e4e8b-b60c-4c37-974a-8bdc1b243135"). InnerVolumeSpecName "kube-api-access-cbflg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.586209 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbflg\" (UniqueName: \"kubernetes.io/projected/8e9e4e8b-b60c-4c37-974a-8bdc1b243135-kube-api-access-cbflg\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.588995 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerStarted","Data":"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a"} Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.589051 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerStarted","Data":"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9"} Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.596986 4816 generic.go:334] "Generic (PLEG): container finished" podID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerID="679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" exitCode=2 Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.597103 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.597154 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e9e4e8b-b60c-4c37-974a-8bdc1b243135","Type":"ContainerDied","Data":"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b"} Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.597187 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e9e4e8b-b60c-4c37-974a-8bdc1b243135","Type":"ContainerDied","Data":"e914685ae7eb058c653bc79edb98cb710a39f5ce6911740300b8ce8933b04af8"} Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.597208 4816 scope.go:117] "RemoveContainer" containerID="679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.616170 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.616149469 podStartE2EDuration="2.616149469s" podCreationTimestamp="2026-03-11 12:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:20:54.608922742 +0000 UTC m=+1341.200186709" watchObservedRunningTime="2026-03-11 12:20:54.616149469 +0000 UTC m=+1341.207413436" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.642682 4816 scope.go:117] "RemoveContainer" containerID="679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" Mar 11 12:20:54 crc kubenswrapper[4816]: E0311 12:20:54.643177 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b\": container with ID starting with 679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b not found: ID does not exist" 
containerID="679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.643221 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b"} err="failed to get container status \"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b\": rpc error: code = NotFound desc = could not find container \"679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b\": container with ID starting with 679abb8cec4559fafe708b16a4cb668342ecfb7db87736a8f89c0b2ddfbcfb1b not found: ID does not exist" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.651890 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.663456 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.678845 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:54 crc kubenswrapper[4816]: E0311 12:20:54.679593 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerName="kube-state-metrics" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.679623 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerName="kube-state-metrics" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.680005 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" containerName="kube-state-metrics" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.681097 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.684141 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.684232 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.692357 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.692514 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.692685 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.692920 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.705663 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.795678 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.795776 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.795845 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.795951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.799014 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:20:54 crc 
kubenswrapper[4816]: I0311 12:20:54.799416 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.801861 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.802402 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.814086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:54 crc kubenswrapper[4816]: I0311 12:20:54.821689 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") pod \"kube-state-metrics-0\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " pod="openstack/kube-state-metrics-0" Mar 11 12:20:55 crc kubenswrapper[4816]: I0311 12:20:55.008568 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:20:55 crc kubenswrapper[4816]: I0311 12:20:55.530749 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:20:55 crc kubenswrapper[4816]: I0311 12:20:55.607361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32dcc96b-186a-444d-bef3-4c5f117ee652","Type":"ContainerStarted","Data":"1e343e65b4d8cc4645e88fc1c1a55d93ec648ea21d55e4018feab7481fc909e7"} Mar 11 12:20:55 crc kubenswrapper[4816]: I0311 12:20:55.843121 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.046618 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.046962 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-central-agent" containerID="cri-o://3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" gracePeriod=30 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.048744 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="proxy-httpd" containerID="cri-o://46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" gracePeriod=30 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.051171 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-notification-agent" containerID="cri-o://cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" gracePeriod=30 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.051969 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="sg-core" containerID="cri-o://91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" gracePeriod=30 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.143565 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9e4e8b-b60c-4c37-974a-8bdc1b243135" path="/var/lib/kubelet/pods/8e9e4e8b-b60c-4c37-974a-8bdc1b243135/volumes" Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624394 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerID="46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" exitCode=0 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624434 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerID="91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" exitCode=2 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624444 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerID="3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" exitCode=0 Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624492 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c"} Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624522 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818"} Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.624532 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b"} Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.626637 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32dcc96b-186a-444d-bef3-4c5f117ee652","Type":"ContainerStarted","Data":"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"} Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.627504 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.664163 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.24142379 podStartE2EDuration="2.664132947s" podCreationTimestamp="2026-03-11 12:20:54 +0000 UTC" firstStartedPulling="2026-03-11 12:20:55.538625988 +0000 UTC m=+1342.129889945" lastFinishedPulling="2026-03-11 12:20:55.961335135 +0000 UTC m=+1342.552599102" observedRunningTime="2026-03-11 12:20:56.653500903 +0000 UTC m=+1343.244764880" watchObservedRunningTime="2026-03-11 12:20:56.664132947 +0000 UTC m=+1343.255396914" Mar 11 12:20:56 crc kubenswrapper[4816]: I0311 12:20:56.886735 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.595411 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657563 4816 generic.go:334] "Generic (PLEG): container finished" podID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerID="cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" exitCode=0 Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657635 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74"} Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657679 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657699 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a9b124c-68d8-44e9-9381-fa448155ef23","Type":"ContainerDied","Data":"38b868cc185bf2881b9763f9f27568b608cb3091bc38e885e64b2566d5c8d41e"} Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.657729 4816 scope.go:117] "RemoveContainer" containerID="46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.682593 4816 scope.go:117] "RemoveContainer" containerID="91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.694802 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.694843 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8dwr\" (UniqueName: 
\"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.694905 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.695915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.696098 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.696754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.696598 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.696837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") pod \"1a9b124c-68d8-44e9-9381-fa448155ef23\" (UID: \"1a9b124c-68d8-44e9-9381-fa448155ef23\") " Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.697188 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.698046 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.698066 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a9b124c-68d8-44e9-9381-fa448155ef23-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.715575 4816 scope.go:117] "RemoveContainer" containerID="cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.715594 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts" (OuterVolumeSpecName: "scripts") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.715714 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr" (OuterVolumeSpecName: "kube-api-access-d8dwr") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "kube-api-access-d8dwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.750739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.800821 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8dwr\" (UniqueName: \"kubernetes.io/projected/1a9b124c-68d8-44e9-9381-fa448155ef23-kube-api-access-d8dwr\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.800882 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.800902 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.806651 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.835348 4816 scope.go:117] "RemoveContainer" containerID="3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.841527 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data" (OuterVolumeSpecName: "config-data") pod "1a9b124c-68d8-44e9-9381-fa448155ef23" (UID: "1a9b124c-68d8-44e9-9381-fa448155ef23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.867979 4816 scope.go:117] "RemoveContainer" containerID="46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" Mar 11 12:20:58 crc kubenswrapper[4816]: E0311 12:20:58.868821 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c\": container with ID starting with 46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c not found: ID does not exist" containerID="46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.868855 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c"} err="failed to get container status \"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c\": rpc error: code = NotFound desc = could not find container \"46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c\": container with ID starting with 
46378e5e41bccbfb3a48252c8c7374cb11b3d1fe0796984239fa59f5ef21eb1c not found: ID does not exist" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.868878 4816 scope.go:117] "RemoveContainer" containerID="91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" Mar 11 12:20:58 crc kubenswrapper[4816]: E0311 12:20:58.869368 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818\": container with ID starting with 91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818 not found: ID does not exist" containerID="91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.869424 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818"} err="failed to get container status \"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818\": rpc error: code = NotFound desc = could not find container \"91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818\": container with ID starting with 91c0f336c13b096a6e37338c7aacd50b6dc2e9bbedfd8f453b0c731b71198818 not found: ID does not exist" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.869470 4816 scope.go:117] "RemoveContainer" containerID="cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" Mar 11 12:20:58 crc kubenswrapper[4816]: E0311 12:20:58.869831 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74\": container with ID starting with cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74 not found: ID does not exist" containerID="cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74" Mar 11 12:20:58 crc 
kubenswrapper[4816]: I0311 12:20:58.869877 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74"} err="failed to get container status \"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74\": rpc error: code = NotFound desc = could not find container \"cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74\": container with ID starting with cf672d6f4e16a27f8ebd02361ddc8dc18ed8b08979ac551c707e61af02836a74 not found: ID does not exist" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.869897 4816 scope.go:117] "RemoveContainer" containerID="3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" Mar 11 12:20:58 crc kubenswrapper[4816]: E0311 12:20:58.870145 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b\": container with ID starting with 3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b not found: ID does not exist" containerID="3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.870176 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b"} err="failed to get container status \"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b\": rpc error: code = NotFound desc = could not find container \"3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b\": container with ID starting with 3b2e12f9db266056fff6910a7cbbd6b8e5b64ba5455f9ba9db2932c38e6dd28b not found: ID does not exist" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.902986 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.903025 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9b124c-68d8-44e9-9381-fa448155ef23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:20:58 crc kubenswrapper[4816]: I0311 12:20:58.999785 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.019715 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.047894 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:59 crc kubenswrapper[4816]: E0311 12:20:59.048849 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-central-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.048870 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-central-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: E0311 12:20:59.048892 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-notification-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.048901 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-notification-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: E0311 12:20:59.048915 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="sg-core" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.048922 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="sg-core" Mar 11 12:20:59 crc kubenswrapper[4816]: E0311 12:20:59.048977 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="proxy-httpd" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.048986 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="proxy-httpd" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.049175 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-notification-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.049197 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="proxy-httpd" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.049257 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="sg-core" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.049272 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" containerName="ceilometer-central-agent" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.054530 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.060101 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.060399 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.060533 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.075086 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.106982 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107326 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107547 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107642 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107714 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107822 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.107915 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.108012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.209281 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.209660 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.209788 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.209906 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210023 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210413 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210722 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.210990 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.211189 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.215042 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.215579 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 
12:20:59.216043 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.216465 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.221161 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.239135 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") pod \"ceilometer-0\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.378829 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.799662 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.800118 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 12:20:59 crc kubenswrapper[4816]: I0311 12:20:59.859324 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.145845 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9b124c-68d8-44e9-9381-fa448155ef23" path="/var/lib/kubelet/pods/1a9b124c-68d8-44e9-9381-fa448155ef23/volumes" Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.695879 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff"} Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.696377 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"27271a41ca8a8c294fc6aef8e91d6f6174e9af95c25d74da01a3a36774d13fba"} Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.812479 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:00 crc kubenswrapper[4816]: I0311 12:21:00.812550 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:01 crc kubenswrapper[4816]: I0311 12:21:01.713726 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b"} Mar 11 12:21:01 crc kubenswrapper[4816]: I0311 12:21:01.890641 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 11 12:21:01 crc kubenswrapper[4816]: I0311 12:21:01.925565 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 11 12:21:02 crc kubenswrapper[4816]: I0311 12:21:02.728343 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec"} Mar 11 12:21:02 crc kubenswrapper[4816]: I0311 12:21:02.774228 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 11 12:21:02 crc kubenswrapper[4816]: I0311 12:21:02.918779 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 12:21:02 crc kubenswrapper[4816]: I0311 12:21:02.918843 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 12:21:04.001581 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 
12:21:04.001601 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 12:21:04.751228 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerStarted","Data":"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763"} Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 12:21:04.753213 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:21:04 crc kubenswrapper[4816]: I0311 12:21:04.780241 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.752034614 podStartE2EDuration="5.780213425s" podCreationTimestamp="2026-03-11 12:20:59 +0000 UTC" firstStartedPulling="2026-03-11 12:20:59.858585535 +0000 UTC m=+1346.449849502" lastFinishedPulling="2026-03-11 12:21:03.886764346 +0000 UTC m=+1350.478028313" observedRunningTime="2026-03-11 12:21:04.774267395 +0000 UTC m=+1351.365531362" watchObservedRunningTime="2026-03-11 12:21:04.780213425 +0000 UTC m=+1351.371477392" Mar 11 12:21:05 crc kubenswrapper[4816]: I0311 12:21:05.021849 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.689472 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.762934 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") pod \"33811121-46de-4941-bd74-18ecaa2c2827\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.763168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") pod \"33811121-46de-4941-bd74-18ecaa2c2827\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.763283 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") pod \"33811121-46de-4941-bd74-18ecaa2c2827\" (UID: \"33811121-46de-4941-bd74-18ecaa2c2827\") " Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.773625 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6" (OuterVolumeSpecName: "kube-api-access-t4xh6") pod "33811121-46de-4941-bd74-18ecaa2c2827" (UID: "33811121-46de-4941-bd74-18ecaa2c2827"). InnerVolumeSpecName "kube-api-access-t4xh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.800267 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data" (OuterVolumeSpecName: "config-data") pod "33811121-46de-4941-bd74-18ecaa2c2827" (UID: "33811121-46de-4941-bd74-18ecaa2c2827"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801398 4816 generic.go:334] "Generic (PLEG): container finished" podID="33811121-46de-4941-bd74-18ecaa2c2827" containerID="63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" exitCode=137 Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801454 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33811121-46de-4941-bd74-18ecaa2c2827","Type":"ContainerDied","Data":"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0"} Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33811121-46de-4941-bd74-18ecaa2c2827","Type":"ContainerDied","Data":"cc7188d0b18641404663ed171ec3812667a2d4778de79e666b89e8d42f9ec1e9"} Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801514 4816 scope.go:117] "RemoveContainer" containerID="63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.801763 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.809034 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33811121-46de-4941-bd74-18ecaa2c2827" (UID: "33811121-46de-4941-bd74-18ecaa2c2827"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.865523 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.865575 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4xh6\" (UniqueName: \"kubernetes.io/projected/33811121-46de-4941-bd74-18ecaa2c2827-kube-api-access-t4xh6\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.865589 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33811121-46de-4941-bd74-18ecaa2c2827-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.886812 4816 scope.go:117] "RemoveContainer" containerID="63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" Mar 11 12:21:08 crc kubenswrapper[4816]: E0311 12:21:08.887345 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0\": container with ID starting with 63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0 not found: ID does not exist" containerID="63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0" Mar 11 12:21:08 crc kubenswrapper[4816]: I0311 12:21:08.887424 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0"} err="failed to get container status \"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0\": rpc error: code = NotFound desc = could not find container \"63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0\": container with ID 
starting with 63a3d6bc49f6f2dab318870b71436674479a2cc4b62d79ccb00a4a4a963f01a0 not found: ID does not exist" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.154185 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.168158 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.203343 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:09 crc kubenswrapper[4816]: E0311 12:21:09.207514 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33811121-46de-4941-bd74-18ecaa2c2827" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.207577 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="33811121-46de-4941-bd74-18ecaa2c2827" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.208188 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="33811121-46de-4941-bd74-18ecaa2c2827" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.209895 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.220143 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.220492 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.220797 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.227334 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.277834 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.277941 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.278379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc 
kubenswrapper[4816]: I0311 12:21:09.278455 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.278494 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.381835 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.381946 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.381988 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 
12:21:09.382040 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.382085 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.388676 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.390478 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.390991 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.400325 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.400797 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") pod \"nova-cell1-novncproxy-0\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.514679 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.514737 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.554580 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.813128 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.814888 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 12:21:09 crc kubenswrapper[4816]: I0311 12:21:09.831980 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.071586 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.162492 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33811121-46de-4941-bd74-18ecaa2c2827" path="/var/lib/kubelet/pods/33811121-46de-4941-bd74-18ecaa2c2827/volumes" Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.831846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fd796be0-d1ac-47be-8162-3b1c42febc0a","Type":"ContainerStarted","Data":"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"} Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.831925 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fd796be0-d1ac-47be-8162-3b1c42febc0a","Type":"ContainerStarted","Data":"73799c30d5d3ab5fe26ad3cf5939299dea4d34493e455f7bcdac484f34941957"} Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.858120 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.858089653 podStartE2EDuration="1.858089653s" podCreationTimestamp="2026-03-11 12:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-11 12:21:10.852105612 +0000 UTC m=+1357.443369579" watchObservedRunningTime="2026-03-11 12:21:10.858089653 +0000 UTC m=+1357.449353650" Mar 11 12:21:10 crc kubenswrapper[4816]: I0311 12:21:10.860343 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.924807 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.925997 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.926911 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.927195 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.930816 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 12:21:12 crc kubenswrapper[4816]: I0311 12:21:12.932556 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.203897 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.206410 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.223852 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294289 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294626 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294766 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.294886 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.295020 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfg5z\" (UniqueName: \"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396209 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396346 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396413 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfg5z\" (UniqueName: \"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396455 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396499 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.396522 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.397699 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.397713 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.397824 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") pod 
\"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.398139 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.398479 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.417694 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfg5z\" (UniqueName: \"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") pod \"dnsmasq-dns-fdb8f6449-7h7r8\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:13 crc kubenswrapper[4816]: I0311 12:21:13.542971 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.099292 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.555675 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.878145 4816 generic.go:334] "Generic (PLEG): container finished" podID="32a279c7-00a8-4e98-8356-91e219416a22" containerID="1876def9a0f72b0ad981ff600f29fd745c0daa03affcc6a0a2083718b834badc" exitCode=0 Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.878423 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerDied","Data":"1876def9a0f72b0ad981ff600f29fd745c0daa03affcc6a0a2083718b834badc"} Mar 11 12:21:14 crc kubenswrapper[4816]: I0311 12:21:14.880001 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerStarted","Data":"88f0e5edf59a2c15eb9814f01d499e770f690a88f8bf62d0decdbb14e939c9e6"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.321360 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.322098 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-central-agent" containerID="cri-o://f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" gracePeriod=30 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.322158 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" containerID="cri-o://692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" gracePeriod=30 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.322223 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-notification-agent" containerID="cri-o://5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" gracePeriod=30 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.322158 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="sg-core" containerID="cri-o://f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" gracePeriod=30 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.339686 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948501 4816 generic.go:334] "Generic (PLEG): container finished" podID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerID="692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" exitCode=0 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948548 4816 generic.go:334] "Generic (PLEG): container finished" podID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerID="f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" exitCode=2 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948561 4816 generic.go:334] "Generic (PLEG): container finished" podID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerID="f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" exitCode=0 Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948643 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948681 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.948695 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.966270 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerStarted","Data":"373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835"} Mar 11 12:21:15 crc kubenswrapper[4816]: I0311 12:21:15.968330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:16 crc kubenswrapper[4816]: E0311 12:21:16.095383 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod940c2849_ce30_473c_9a55_b4fc35309bb7.slice/crio-5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.865800 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" podStartSLOduration=3.865776857 podStartE2EDuration="3.865776857s" 
podCreationTimestamp="2026-03-11 12:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:16.002745143 +0000 UTC m=+1362.594009100" watchObservedRunningTime="2026-03-11 12:21:16.865776857 +0000 UTC m=+1363.457040824" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.872554 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.872864 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" containerID="cri-o://ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" gracePeriod=30 Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.873450 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" containerID="cri-o://e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" gracePeriod=30 Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.887835 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926271 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926441 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926486 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926572 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926650 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.926676 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.934316 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.934430 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") pod \"940c2849-ce30-473c-9a55-b4fc35309bb7\" (UID: \"940c2849-ce30-473c-9a55-b4fc35309bb7\") " Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.936646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.937102 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.943817 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh" (OuterVolumeSpecName: "kube-api-access-8x2sh") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "kube-api-access-8x2sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.950944 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts" (OuterVolumeSpecName: "scripts") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.992813 4816 generic.go:334] "Generic (PLEG): container finished" podID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerID="5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" exitCode=0 Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.993344 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.994144 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b"} Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.994183 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"940c2849-ce30-473c-9a55-b4fc35309bb7","Type":"ContainerDied","Data":"27271a41ca8a8c294fc6aef8e91d6f6174e9af95c25d74da01a3a36774d13fba"} Mar 11 12:21:16 crc kubenswrapper[4816]: I0311 12:21:16.994205 4816 scope.go:117] "RemoveContainer" containerID="692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.012501 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038758 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038800 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038811 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038824 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940c2849-ce30-473c-9a55-b4fc35309bb7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.038832 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x2sh\" (UniqueName: \"kubernetes.io/projected/940c2849-ce30-473c-9a55-b4fc35309bb7-kube-api-access-8x2sh\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.063120 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.070188 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.080820 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data" (OuterVolumeSpecName: "config-data") pod "940c2849-ce30-473c-9a55-b4fc35309bb7" (UID: "940c2849-ce30-473c-9a55-b4fc35309bb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.116765 4816 scope.go:117] "RemoveContainer" containerID="f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.141152 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.141324 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.141418 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940c2849-ce30-473c-9a55-b4fc35309bb7-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.141224 4816 scope.go:117] 
"RemoveContainer" containerID="5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.168506 4816 scope.go:117] "RemoveContainer" containerID="f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.193459 4816 scope.go:117] "RemoveContainer" containerID="692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.194080 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763\": container with ID starting with 692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763 not found: ID does not exist" containerID="692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.194136 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763"} err="failed to get container status \"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763\": rpc error: code = NotFound desc = could not find container \"692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763\": container with ID starting with 692a59aab8ed0fb0929a6f8bf9b1c8bc207d5d1a602944a57acd9df588a34763 not found: ID does not exist" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.194169 4816 scope.go:117] "RemoveContainer" containerID="f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.194735 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec\": container with ID starting with 
f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec not found: ID does not exist" containerID="f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.194860 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec"} err="failed to get container status \"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec\": rpc error: code = NotFound desc = could not find container \"f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec\": container with ID starting with f66037d7df88f58d61e86637f837290145123e6de95ebca3b16d8bb882106aec not found: ID does not exist" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.194973 4816 scope.go:117] "RemoveContainer" containerID="5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.195365 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b\": container with ID starting with 5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b not found: ID does not exist" containerID="5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.195410 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b"} err="failed to get container status \"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b\": rpc error: code = NotFound desc = could not find container \"5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b\": container with ID starting with 5e35557f09b19997180c50735b550f53beb23be40ddd5bfc3e12122521653c8b not found: ID does not 
exist" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.195471 4816 scope.go:117] "RemoveContainer" containerID="f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.196000 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff\": container with ID starting with f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff not found: ID does not exist" containerID="f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.196021 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff"} err="failed to get container status \"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff\": rpc error: code = NotFound desc = could not find container \"f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff\": container with ID starting with f34f311dd5e09eaf0d55b74ce05a1c2288151feba906d4592da42d922bbba2ff not found: ID does not exist" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.333757 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.346163 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366146 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.366716 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366741 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.366752 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-central-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366759 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-central-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.366786 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="sg-core" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366793 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="sg-core" Mar 11 12:21:17 crc kubenswrapper[4816]: E0311 12:21:17.366813 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-notification-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.366820 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-notification-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.367012 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-notification-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.367025 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="sg-core" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.367036 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="ceilometer-central-agent" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.367044 4816 
memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" containerName="proxy-httpd" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.369037 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.376281 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.376565 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.376914 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.392082 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.447748 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.447857 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448068 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") pod \"ceilometer-0\" (UID: 
\"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448190 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448383 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448470 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448708 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.448944 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 
12:21:17.550956 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551030 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551054 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551093 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551120 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551188 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") 
pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551304 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.551375 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.552211 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.552299 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.558086 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.558375 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.560378 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.560715 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.562001 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.572872 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") pod \"ceilometer-0\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " pod="openstack/ceilometer-0" Mar 11 12:21:17 crc kubenswrapper[4816]: I0311 12:21:17.706905 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:18 crc kubenswrapper[4816]: I0311 12:21:18.006365 4816 generic.go:334] "Generic (PLEG): container finished" podID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerID="ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" exitCode=143 Mar 11 12:21:18 crc kubenswrapper[4816]: I0311 12:21:18.006473 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerDied","Data":"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9"} Mar 11 12:21:18 crc kubenswrapper[4816]: I0311 12:21:18.145093 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940c2849-ce30-473c-9a55-b4fc35309bb7" path="/var/lib/kubelet/pods/940c2849-ce30-473c-9a55-b4fc35309bb7/volumes" Mar 11 12:21:18 crc kubenswrapper[4816]: I0311 12:21:18.262831 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:18 crc kubenswrapper[4816]: W0311 12:21:18.263275 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod389a1019_c47b_449b_ac46_f0271ba70c0b.slice/crio-831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778 WatchSource:0}: Error finding container 831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778: Status 404 returned error can't find the container with id 831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778 Mar 11 12:21:19 crc kubenswrapper[4816]: I0311 12:21:19.029026 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778"} Mar 11 12:21:19 crc kubenswrapper[4816]: I0311 12:21:19.478370 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 
12:21:19 crc kubenswrapper[4816]: I0311 12:21:19.555478 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:19 crc kubenswrapper[4816]: I0311 12:21:19.593214 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.043170 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9"} Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.043671 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea"} Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.067010 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.362334 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"] Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.365944 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.370839 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.371928 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.382707 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"] Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.425892 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.426380 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.426453 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.426499 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.528311 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.528446 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.528482 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.528550 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.533286 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.535900 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.537680 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.548023 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.554330 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") pod \"nova-cell1-cell-mapping-wsfdf\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.630712 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") pod \"8b44498c-88b3-42e4-b8cd-322579c29a3e\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.630886 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") pod \"8b44498c-88b3-42e4-b8cd-322579c29a3e\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.631028 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") pod \"8b44498c-88b3-42e4-b8cd-322579c29a3e\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.631157 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") pod \"8b44498c-88b3-42e4-b8cd-322579c29a3e\" (UID: \"8b44498c-88b3-42e4-b8cd-322579c29a3e\") " Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.634922 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs" (OuterVolumeSpecName: "logs") pod "8b44498c-88b3-42e4-b8cd-322579c29a3e" (UID: "8b44498c-88b3-42e4-b8cd-322579c29a3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.646582 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf" (OuterVolumeSpecName: "kube-api-access-42kkf") pod "8b44498c-88b3-42e4-b8cd-322579c29a3e" (UID: "8b44498c-88b3-42e4-b8cd-322579c29a3e"). InnerVolumeSpecName "kube-api-access-42kkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.667922 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data" (OuterVolumeSpecName: "config-data") pod "8b44498c-88b3-42e4-b8cd-322579c29a3e" (UID: "8b44498c-88b3-42e4-b8cd-322579c29a3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.681423 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b44498c-88b3-42e4-b8cd-322579c29a3e" (UID: "8b44498c-88b3-42e4-b8cd-322579c29a3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.734160 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b44498c-88b3-42e4-b8cd-322579c29a3e-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.734194 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.734205 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42kkf\" (UniqueName: \"kubernetes.io/projected/8b44498c-88b3-42e4-b8cd-322579c29a3e-kube-api-access-42kkf\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.734216 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b44498c-88b3-42e4-b8cd-322579c29a3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:20 crc kubenswrapper[4816]: I0311 12:21:20.808679 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.073056 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077389 4816 generic.go:334] "Generic (PLEG): container finished" podID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerID="e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" exitCode=0 Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077505 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077510 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerDied","Data":"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077586 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b44498c-88b3-42e4-b8cd-322579c29a3e","Type":"ContainerDied","Data":"3161af8bf1555f75f7e3fe8c5b6c7028f30e608b5088c5375d09c6a61566d4c9"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.077613 4816 scope.go:117] "RemoveContainer" containerID="e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.120388 4816 scope.go:117] "RemoveContainer" containerID="ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.145343 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.174905 4816 scope.go:117] "RemoveContainer" containerID="e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" Mar 11 12:21:22 crc kubenswrapper[4816]: E0311 12:21:21.175467 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a\": container with ID starting with e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a not found: ID does not exist" containerID="e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.175521 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a"} err="failed to get container status \"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a\": rpc error: code = NotFound desc = could not find container \"e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a\": container with ID starting with e810fb5b31af6c475fbc24ea8c5aeb1d4e563c63735517ab9297f7db17e0ca5a not found: ID does not exist" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.175566 4816 scope.go:117] "RemoveContainer" containerID="ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" Mar 11 12:21:22 crc kubenswrapper[4816]: E0311 12:21:21.175874 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9\": container with ID starting with ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9 not found: ID does not exist" containerID="ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.175889 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9"} err="failed to get container status \"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9\": rpc error: code = NotFound desc = could not find container \"ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9\": container with ID starting with ebc1320569afb95200b6afc79a82f2129b48fb5a36a78a5a136bc96434e738f9 not found: ID does not exist" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.176712 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.187354 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 
12:21:22 crc kubenswrapper[4816]: E0311 12:21:21.187971 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.187991 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" Mar 11 12:21:22 crc kubenswrapper[4816]: E0311 12:21:21.188033 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.188040 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.188287 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-api" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.188308 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" containerName="nova-api-log" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.189553 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.195342 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.245727 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.245783 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.245813 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.245840 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.246038 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") pod \"nova-api-0\" (UID: 
\"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.246074 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.348653 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349152 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349183 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349281 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.349316 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.350510 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.363730 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.381945 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.382240 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.382409 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.393740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.395861 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.396333 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.416918 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") pod \"nova-api-0\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.462855 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"] Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:21.507928 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.103392 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wsfdf" event={"ID":"36fadc66-c846-46c0-a002-efeb7656f2b8","Type":"ContainerStarted","Data":"adf484d20700d25957189d351eb669acaae4683a20326267761afe30c6a7e50c"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.103907 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wsfdf" event={"ID":"36fadc66-c846-46c0-a002-efeb7656f2b8","Type":"ContainerStarted","Data":"f1ca638575d3d6823fa339abfb04a9bb46bfeaa2c8671cd04523b6370d4416be"} Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.126533 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wsfdf" podStartSLOduration=2.12651104 podStartE2EDuration="2.12651104s" podCreationTimestamp="2026-03-11 12:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:22.124447441 +0000 UTC m=+1368.715711408" watchObservedRunningTime="2026-03-11 12:21:22.12651104 +0000 UTC m=+1368.717775007" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.145591 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b44498c-88b3-42e4-b8cd-322579c29a3e" path="/var/lib/kubelet/pods/8b44498c-88b3-42e4-b8cd-322579c29a3e/volumes" Mar 11 12:21:22 crc kubenswrapper[4816]: I0311 12:21:22.501071 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.127349 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerStarted","Data":"d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441"} Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 
12:21:23.128175 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-central-agent" containerID="cri-o://86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea" gracePeriod=30 Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.128374 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="proxy-httpd" containerID="cri-o://d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441" gracePeriod=30 Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.128409 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-notification-agent" containerID="cri-o://3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9" gracePeriod=30 Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.128491 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="sg-core" containerID="cri-o://189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125" gracePeriod=30 Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.128579 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.134771 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerStarted","Data":"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"} Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.134855 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerStarted","Data":"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"} Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.134878 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerStarted","Data":"1479f433dd0af53602fa7b4358ac16a8893fc6ce4f3fc3758931ae0187bafc3e"} Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.174542 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.851458799 podStartE2EDuration="6.174513366s" podCreationTimestamp="2026-03-11 12:21:17 +0000 UTC" firstStartedPulling="2026-03-11 12:21:18.266427899 +0000 UTC m=+1364.857691866" lastFinishedPulling="2026-03-11 12:21:22.589482466 +0000 UTC m=+1369.180746433" observedRunningTime="2026-03-11 12:21:23.158831786 +0000 UTC m=+1369.750095773" watchObservedRunningTime="2026-03-11 12:21:23.174513366 +0000 UTC m=+1369.765777343" Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.196701 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.1966701730000002 podStartE2EDuration="2.196670173s" podCreationTimestamp="2026-03-11 12:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:23.188274292 +0000 UTC m=+1369.779538249" watchObservedRunningTime="2026-03-11 12:21:23.196670173 +0000 UTC m=+1369.787934140" Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.545695 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:21:23 crc kubenswrapper[4816]: I0311 12:21:23.634094 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:21:23 crc 
kubenswrapper[4816]: I0311 12:21:23.635015 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="dnsmasq-dns" containerID="cri-o://c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6" gracePeriod=10 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.152802 4816 generic.go:334] "Generic (PLEG): container finished" podID="1370549e-42a3-450d-a28d-47d4a0764f56" containerID="c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6" exitCode=0 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.166683 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerDied","Data":"c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.166741 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" event={"ID":"1370549e-42a3-450d-a28d-47d4a0764f56","Type":"ContainerDied","Data":"31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.166757 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31af9c3f8588e04ef95c2018485a2e29382e231f1e73609996971ecefe64ea0d" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175470 4816 generic.go:334] "Generic (PLEG): container finished" podID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerID="d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441" exitCode=0 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175515 4816 generic.go:334] "Generic (PLEG): container finished" podID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerID="189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125" exitCode=2 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 
12:21:24.175526 4816 generic.go:334] "Generic (PLEG): container finished" podID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerID="3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9" exitCode=0 Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175731 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175801 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.175814 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9"} Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.216842 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.343602 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344113 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344299 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344323 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344371 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.344444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") pod \"1370549e-42a3-450d-a28d-47d4a0764f56\" (UID: \"1370549e-42a3-450d-a28d-47d4a0764f56\") " Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.364481 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl" (OuterVolumeSpecName: "kube-api-access-jgzdl") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "kube-api-access-jgzdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.407934 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.425617 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.431097 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config" (OuterVolumeSpecName: "config") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.441657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447118 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447242 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447309 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447375 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgzdl\" (UniqueName: \"kubernetes.io/projected/1370549e-42a3-450d-a28d-47d4a0764f56-kube-api-access-jgzdl\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.447426 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.460183 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1370549e-42a3-450d-a28d-47d4a0764f56" (UID: "1370549e-42a3-450d-a28d-47d4a0764f56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:21:24 crc kubenswrapper[4816]: I0311 12:21:24.550043 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1370549e-42a3-450d-a28d-47d4a0764f56-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:25 crc kubenswrapper[4816]: I0311 12:21:25.187060 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-bsnbn" Mar 11 12:21:25 crc kubenswrapper[4816]: I0311 12:21:25.234348 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:21:25 crc kubenswrapper[4816]: I0311 12:21:25.248203 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-bsnbn"] Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.159850 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" path="/var/lib/kubelet/pods/1370549e-42a3-450d-a28d-47d4a0764f56/volumes" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.207870 4816 generic.go:334] "Generic (PLEG): container finished" podID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerID="86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea" exitCode=0 Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.208107 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea"} Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.513745 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596394 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596562 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596734 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596784 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596819 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596851 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.596879 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") pod \"389a1019-c47b-449b-ac46-f0271ba70c0b\" (UID: \"389a1019-c47b-449b-ac46-f0271ba70c0b\") " Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.597938 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.598171 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.608328 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm" (OuterVolumeSpecName: "kube-api-access-h2hpm") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "kube-api-access-h2hpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.616106 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts" (OuterVolumeSpecName: "scripts") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.633827 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.659437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.685783 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699862 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699919 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699943 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2hpm\" (UniqueName: \"kubernetes.io/projected/389a1019-c47b-449b-ac46-f0271ba70c0b-kube-api-access-h2hpm\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699980 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389a1019-c47b-449b-ac46-f0271ba70c0b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.699999 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.700018 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.700034 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.723540 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data" (OuterVolumeSpecName: "config-data") pod "389a1019-c47b-449b-ac46-f0271ba70c0b" (UID: "389a1019-c47b-449b-ac46-f0271ba70c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:26 crc kubenswrapper[4816]: I0311 12:21:26.802188 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389a1019-c47b-449b-ac46-f0271ba70c0b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.226285 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389a1019-c47b-449b-ac46-f0271ba70c0b","Type":"ContainerDied","Data":"831f866c847ed0c4a4e75849b87d63d375222a7a188ecc44b5169bd7010ae778"} Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.226773 4816 scope.go:117] "RemoveContainer" containerID="d506ef0b1a20cb0137c7e713819501bcf9a1c0dd99e6ec71affc5c1084fb1441" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.226377 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.264538 4816 scope.go:117] "RemoveContainer" containerID="189218a0b9eca174d0a87d53dc63a64ae6c4741afd3bf140d2544540d81d6125" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.290068 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.303267 4816 scope.go:117] "RemoveContainer" containerID="3c33e4e96ad95d72d477f29cdd83ed17043f7147e78c943eda376d648d31d9b9" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.305376 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324030 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324652 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="proxy-httpd" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324679 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="proxy-httpd" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324708 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-notification-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324718 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-notification-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324730 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="dnsmasq-dns" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324739 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="dnsmasq-dns" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324763 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-central-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324771 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-central-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324800 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="sg-core" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324808 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="sg-core" Mar 11 12:21:27 crc kubenswrapper[4816]: E0311 12:21:27.324825 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="init" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.324833 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="init" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325030 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="proxy-httpd" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325052 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1370549e-42a3-450d-a28d-47d4a0764f56" containerName="dnsmasq-dns" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325070 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-notification-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325080 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="ceilometer-central-agent" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.325099 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" containerName="sg-core" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.327778 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.330564 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.333767 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.337446 4816 scope.go:117] "RemoveContainer" containerID="86830a024f06c2328c22d5b921acb795fe3b73a3eede1e8d875dfee3806bd2ea" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.340763 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.363958 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417268 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417348 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " 
pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417379 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417396 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417430 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417466 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417896 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.417995 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519624 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519771 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519805 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519827 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519883 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519899 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.519929 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.521471 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.521494 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.525921 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.525923 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.527626 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.527645 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.541605 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.543419 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") pod \"ceilometer-0\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " pod="openstack/ceilometer-0" Mar 11 12:21:27 crc kubenswrapper[4816]: I0311 12:21:27.650419 4816 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.159923 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389a1019-c47b-449b-ac46-f0271ba70c0b" path="/var/lib/kubelet/pods/389a1019-c47b-449b-ac46-f0271ba70c0b/volumes" Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.163845 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.244990 4816 generic.go:334] "Generic (PLEG): container finished" podID="36fadc66-c846-46c0-a002-efeb7656f2b8" containerID="adf484d20700d25957189d351eb669acaae4683a20326267761afe30c6a7e50c" exitCode=0 Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.245303 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wsfdf" event={"ID":"36fadc66-c846-46c0-a002-efeb7656f2b8","Type":"ContainerDied","Data":"adf484d20700d25957189d351eb669acaae4683a20326267761afe30c6a7e50c"} Mar 11 12:21:28 crc kubenswrapper[4816]: I0311 12:21:28.247798 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"c1f12afb3ed2335d5b28ac089b50b4a7d4f0e38f3d3c1e7e1f537108eabd58b9"} Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.272049 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015"} Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.720027 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.785030 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") pod \"36fadc66-c846-46c0-a002-efeb7656f2b8\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.785302 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") pod \"36fadc66-c846-46c0-a002-efeb7656f2b8\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.785339 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") pod \"36fadc66-c846-46c0-a002-efeb7656f2b8\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.785372 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") pod \"36fadc66-c846-46c0-a002-efeb7656f2b8\" (UID: \"36fadc66-c846-46c0-a002-efeb7656f2b8\") " Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.791234 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl" (OuterVolumeSpecName: "kube-api-access-v7hxl") pod "36fadc66-c846-46c0-a002-efeb7656f2b8" (UID: "36fadc66-c846-46c0-a002-efeb7656f2b8"). InnerVolumeSpecName "kube-api-access-v7hxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.814896 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts" (OuterVolumeSpecName: "scripts") pod "36fadc66-c846-46c0-a002-efeb7656f2b8" (UID: "36fadc66-c846-46c0-a002-efeb7656f2b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.846379 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data" (OuterVolumeSpecName: "config-data") pod "36fadc66-c846-46c0-a002-efeb7656f2b8" (UID: "36fadc66-c846-46c0-a002-efeb7656f2b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.846488 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36fadc66-c846-46c0-a002-efeb7656f2b8" (UID: "36fadc66-c846-46c0-a002-efeb7656f2b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.888045 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.888086 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.888101 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fadc66-c846-46c0-a002-efeb7656f2b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:29 crc kubenswrapper[4816]: I0311 12:21:29.888113 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7hxl\" (UniqueName: \"kubernetes.io/projected/36fadc66-c846-46c0-a002-efeb7656f2b8-kube-api-access-v7hxl\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.294899 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242"} Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.297232 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wsfdf" event={"ID":"36fadc66-c846-46c0-a002-efeb7656f2b8","Type":"ContainerDied","Data":"f1ca638575d3d6823fa339abfb04a9bb46bfeaa2c8671cd04523b6370d4416be"} Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.297284 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1ca638575d3d6823fa339abfb04a9bb46bfeaa2c8671cd04523b6370d4416be" Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 
12:21:30.297376 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wsfdf" Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.454144 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.454504 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-log" containerID="cri-o://744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a" gracePeriod=30 Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.454542 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-api" containerID="cri-o://fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381" gracePeriod=30 Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.479630 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.482417 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" containerName="nova-scheduler-scheduler" containerID="cri-o://4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d" gracePeriod=30 Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.596318 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.597167 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log" containerID="cri-o://8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097" gracePeriod=30 Mar 11 
12:21:30 crc kubenswrapper[4816]: I0311 12:21:30.597411 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata" containerID="cri-o://65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a" gracePeriod=30 Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.158782 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.215897 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.215988 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.216130 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.216213 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.216363 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.216441 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") pod \"7279e91c-fd54-4a52-a247-c5e38a231907\" (UID: \"7279e91c-fd54-4a52-a247-c5e38a231907\") " Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.219021 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs" (OuterVolumeSpecName: "logs") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.225843 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5" (OuterVolumeSpecName: "kube-api-access-w8dh5") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "kube-api-access-w8dh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.257683 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data" (OuterVolumeSpecName: "config-data") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.259597 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.295507 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.297340 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7279e91c-fd54-4a52-a247-c5e38a231907" (UID: "7279e91c-fd54-4a52-a247-c5e38a231907"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.312303 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5"}
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.319556 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.319914 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.319993 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.320113 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7279e91c-fd54-4a52-a247-c5e38a231907-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.320186 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7279e91c-fd54-4a52-a247-c5e38a231907-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.320419 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dh5\" (UniqueName: \"kubernetes.io/projected/7279e91c-fd54-4a52-a247-c5e38a231907-kube-api-access-w8dh5\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.323120 4816 generic.go:334] "Generic (PLEG): container finished" podID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerID="8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097" exitCode=143
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.323202 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerDied","Data":"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"}
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.328530 4816 generic.go:334] "Generic (PLEG): container finished" podID="49c3f447-334e-4147-b877-22a0ce6e3345" containerID="4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d" exitCode=0
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.328626 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49c3f447-334e-4147-b877-22a0ce6e3345","Type":"ContainerDied","Data":"4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d"}
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334310 4816 generic.go:334] "Generic (PLEG): container finished" podID="7279e91c-fd54-4a52-a247-c5e38a231907" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381" exitCode=0
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334336 4816 generic.go:334] "Generic (PLEG): container finished" podID="7279e91c-fd54-4a52-a247-c5e38a231907" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a" exitCode=143
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334356 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerDied","Data":"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"}
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334380 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerDied","Data":"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"}
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334383 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334402 4816 scope.go:117] "RemoveContainer" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.334390 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7279e91c-fd54-4a52-a247-c5e38a231907","Type":"ContainerDied","Data":"1479f433dd0af53602fa7b4358ac16a8893fc6ce4f3fc3758931ae0187bafc3e"}
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.422938 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.435269 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.446389 4816 scope.go:117] "RemoveContainer" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.452983 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.453548 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-log"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453568 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-log"
Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.453594 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fadc66-c846-46c0-a002-efeb7656f2b8" containerName="nova-manage"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453601 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fadc66-c846-46c0-a002-efeb7656f2b8" containerName="nova-manage"
Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.453622 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-api"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453628 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-api"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453877 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-api"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453907 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fadc66-c846-46c0-a002-efeb7656f2b8" containerName="nova-manage"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.453922 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" containerName="nova-api-log"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.455226 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.460069 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.460396 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.460542 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.475462 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.506324 4816 scope.go:117] "RemoveContainer" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"
Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.510139 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": container with ID starting with fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381 not found: ID does not exist" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.510217 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"} err="failed to get container status \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": rpc error: code = NotFound desc = could not find container \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": container with ID starting with fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381 not found: ID does not exist"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.510275 4816 scope.go:117] "RemoveContainer" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"
Mar 11 12:21:31 crc kubenswrapper[4816]: E0311 12:21:31.512323 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": container with ID starting with 744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a not found: ID does not exist" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.512374 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"} err="failed to get container status \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": rpc error: code = NotFound desc = could not find container \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": container with ID starting with 744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a not found: ID does not exist"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.512398 4816 scope.go:117] "RemoveContainer" containerID="fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.513025 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381"} err="failed to get container status \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": rpc error: code = NotFound desc = could not find container \"fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381\": container with ID starting with fa6e70cb872c0bb963f57dccb1e8cfa3a80de411767619fafe0156ebe1500381 not found: ID does not exist"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.513077 4816 scope.go:117] "RemoveContainer" containerID="744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.513432 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a"} err="failed to get container status \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": rpc error: code = NotFound desc = could not find container \"744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a\": container with ID starting with 744232eb02ed0a62b4c386367a237348d207c0352aee779d03b38dd46cecc95a not found: ID does not exist"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525050 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525205 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525241 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525382 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525419 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.525460 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.627449 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628175 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628272 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628304 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628344 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.628363 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.629456 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.632512 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.632946 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.634272 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.636907 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.660722 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") pod \"nova-api-0\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.703481 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.735854 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") pod \"49c3f447-334e-4147-b877-22a0ce6e3345\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") "
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.736235 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") pod \"49c3f447-334e-4147-b877-22a0ce6e3345\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") "
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.736515 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") pod \"49c3f447-334e-4147-b877-22a0ce6e3345\" (UID: \"49c3f447-334e-4147-b877-22a0ce6e3345\") "
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.743331 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9" (OuterVolumeSpecName: "kube-api-access-84md9") pod "49c3f447-334e-4147-b877-22a0ce6e3345" (UID: "49c3f447-334e-4147-b877-22a0ce6e3345"). InnerVolumeSpecName "kube-api-access-84md9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.770734 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data" (OuterVolumeSpecName: "config-data") pod "49c3f447-334e-4147-b877-22a0ce6e3345" (UID: "49c3f447-334e-4147-b877-22a0ce6e3345"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.787631 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49c3f447-334e-4147-b877-22a0ce6e3345" (UID: "49c3f447-334e-4147-b877-22a0ce6e3345"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.794218 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.842129 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.842172 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c3f447-334e-4147-b877-22a0ce6e3345-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:31 crc kubenswrapper[4816]: I0311 12:21:31.842182 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84md9\" (UniqueName: \"kubernetes.io/projected/49c3f447-334e-4147-b877-22a0ce6e3345-kube-api-access-84md9\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.141952 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7279e91c-fd54-4a52-a247-c5e38a231907" path="/var/lib/kubelet/pods/7279e91c-fd54-4a52-a247-c5e38a231907/volumes"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.277242 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 12:21:32 crc kubenswrapper[4816]: W0311 12:21:32.289500 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28745d2_082d_4c99_90f0_b6c4696fb1a2.slice/crio-8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014 WatchSource:0}: Error finding container 8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014: Status 404 returned error can't find the container with id 8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.351458 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerStarted","Data":"8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014"}
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.354018 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49c3f447-334e-4147-b877-22a0ce6e3345","Type":"ContainerDied","Data":"80c23e1f7785724059b85c847854192b3471a718a42ed80849445c1edfb1f7c4"}
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.354106 4816 scope.go:117] "RemoveContainer" containerID="4ce5b26a7642dbb3c4b2d4c21f23040b0afe51a33212a23837a87602f659ac7d"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.354145 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.398897 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.415516 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.428191 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:21:32 crc kubenswrapper[4816]: E0311 12:21:32.428785 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" containerName="nova-scheduler-scheduler"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.429428 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" containerName="nova-scheduler-scheduler"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.430226 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" containerName="nova-scheduler-scheduler"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.431405 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.439218 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.445745 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.579632 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.579875 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.579926 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.682472 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.683072 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.683097 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.690416 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.690844 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.701917 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") pod \"nova-scheduler-0\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " pod="openstack/nova-scheduler-0"
Mar 11 12:21:32 crc kubenswrapper[4816]: I0311 12:21:32.756336 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.265603 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 11 12:21:33 crc kubenswrapper[4816]: W0311 12:21:33.271926 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41f4b502_b85f_488c_b55b_27a31479df68.slice/crio-0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440 WatchSource:0}: Error finding container 0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440: Status 404 returned error can't find the container with id 0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440
Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.370780 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"41f4b502-b85f-488c-b55b-27a31479df68","Type":"ContainerStarted","Data":"0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440"}
Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.375412 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerStarted","Data":"a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809"}
Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.376158 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.381541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerStarted","Data":"f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d"}
Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.381599 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerStarted","Data":"9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de"}
Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.419823 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.643419558 podStartE2EDuration="6.419803313s" podCreationTimestamp="2026-03-11 12:21:27 +0000 UTC" firstStartedPulling="2026-03-11 12:21:28.174205934 +0000 UTC m=+1374.765469911" lastFinishedPulling="2026-03-11 12:21:32.950589699 +0000 UTC m=+1379.541853666" observedRunningTime="2026-03-11 12:21:33.404818773 +0000 UTC m=+1379.996082760" watchObservedRunningTime="2026-03-11 12:21:33.419803313 +0000 UTC m=+1380.011067280"
Mar 11 12:21:33 crc kubenswrapper[4816]: I0311 12:21:33.442717 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.442692891 podStartE2EDuration="2.442692891s" podCreationTimestamp="2026-03-11 12:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:33.437497982 +0000 UTC m=+1380.028761969" watchObservedRunningTime="2026-03-11 12:21:33.442692891 +0000 UTC m=+1380.033956848"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.155502 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c3f447-334e-4147-b877-22a0ce6e3345" path="/var/lib/kubelet/pods/49c3f447-334e-4147-b877-22a0ce6e3345/volumes"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.287625 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.328600 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") "
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.328714 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") "
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.329225 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") "
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.329383 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") "
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.329423 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") pod \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\" (UID: \"3b8751c6-ef60-400a-b4e3-0042d63c2d83\") "
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.335979 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs" (OuterVolumeSpecName: "logs") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.363517 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx" (OuterVolumeSpecName: "kube-api-access-bzfsx") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "kube-api-access-bzfsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.403934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"41f4b502-b85f-488c-b55b-27a31479df68","Type":"ContainerStarted","Data":"60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744"}
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.414988 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data" (OuterVolumeSpecName: "config-data") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417121 4816 generic.go:334] "Generic (PLEG): container finished" podID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerID="65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a" exitCode=0
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417438 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerDied","Data":"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"}
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417501 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b8751c6-ef60-400a-b4e3-0042d63c2d83","Type":"ContainerDied","Data":"3d6f4a92fab1ae4820eecc5176239bbe544418957d5b8b49929c39dc6ee8800c"}
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417524 4816 scope.go:117] "RemoveContainer" containerID="65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.417644 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.420039 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.435432 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8751c6-ef60-400a-b4e3-0042d63c2d83-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.436374 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzfsx\" (UniqueName: \"kubernetes.io/projected/3b8751c6-ef60-400a-b4e3-0042d63c2d83-kube-api-access-bzfsx\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.436459 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.436914 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.460513 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.4604915800000002 podStartE2EDuration="2.46049158s" podCreationTimestamp="2026-03-11 12:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:34.437949602 +0000 UTC m=+1381.029213569" watchObservedRunningTime="2026-03-11 12:21:34.46049158 +0000 UTC m=+1381.051755547"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.467237 4816 scope.go:117] "RemoveContainer" containerID="8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.470954 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3b8751c6-ef60-400a-b4e3-0042d63c2d83" (UID: "3b8751c6-ef60-400a-b4e3-0042d63c2d83"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.503582 4816 scope.go:117] "RemoveContainer" containerID="65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"
Mar 11 12:21:34 crc kubenswrapper[4816]: E0311 12:21:34.504161 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a\": container with ID starting with 65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a not found: ID does not exist" containerID="65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.504205 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a"} err="failed to get container status \"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a\": rpc error: code = NotFound desc = could not find container \"65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a\": container with ID starting with 65a6f4699baa07bb2587be04129499415c2d2f177b58bb5a282876ee282e965a not found: ID does not exist"
Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.504232 4816 scope.go:117] "RemoveContainer" containerID="8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"
Mar 11 12:21:34 crc kubenswrapper[4816]: E0311 12:21:34.504588 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097\":
container with ID starting with 8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097 not found: ID does not exist" containerID="8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.504654 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097"} err="failed to get container status \"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097\": rpc error: code = NotFound desc = could not find container \"8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097\": container with ID starting with 8e786968694e08cb0fddd905eefb0efe274d795cc622741f10ed840a98693097 not found: ID does not exist" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.539454 4816 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8751c6-ef60-400a-b4e3-0042d63c2d83-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.760303 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.776804 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.803654 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:21:34 crc kubenswrapper[4816]: E0311 12:21:34.804328 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.804352 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata" Mar 11 12:21:34 crc kubenswrapper[4816]: E0311 
12:21:34.804391 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.804399 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.804643 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-metadata" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.804674 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" containerName="nova-metadata-log" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.806026 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.808408 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.810601 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.810879 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.851400 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.851887 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.852016 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.852183 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.852315 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954280 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954357 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954419 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954571 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.954615 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.957335 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.970525 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.970971 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.971972 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:34 crc kubenswrapper[4816]: I0311 12:21:34.973774 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") pod \"nova-metadata-0\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") " pod="openstack/nova-metadata-0" Mar 11 12:21:35 crc kubenswrapper[4816]: I0311 12:21:35.127771 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 12:21:35 crc kubenswrapper[4816]: I0311 12:21:35.610243 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.147226 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b8751c6-ef60-400a-b4e3-0042d63c2d83" path="/var/lib/kubelet/pods/3b8751c6-ef60-400a-b4e3-0042d63c2d83/volumes" Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.439595 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerStarted","Data":"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"} Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.439686 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerStarted","Data":"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"} Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.439706 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerStarted","Data":"c81825bf2b4be781ea36bdb64201016c8530a7353fa6d58d50264ccf72608bde"} Mar 11 12:21:36 crc kubenswrapper[4816]: I0311 12:21:36.479766 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.479738296 podStartE2EDuration="2.479738296s" podCreationTimestamp="2026-03-11 12:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 12:21:36.466752364 +0000 UTC m=+1383.058016331" watchObservedRunningTime="2026-03-11 12:21:36.479738296 +0000 UTC m=+1383.071002263" Mar 11 12:21:37 crc kubenswrapper[4816]: I0311 12:21:37.757197 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 12:21:39 crc kubenswrapper[4816]: I0311 12:21:39.515831 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:21:39 crc kubenswrapper[4816]: I0311 12:21:39.516829 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:21:40 crc kubenswrapper[4816]: I0311 12:21:40.128603 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:21:40 crc kubenswrapper[4816]: I0311 12:21:40.128769 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.556934 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"] Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.559842 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.580723 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"] Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.635611 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.635975 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.636119 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.738352 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.738419 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.738501 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.738920 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.739011 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.759889 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") pod \"redhat-operators-c9c6p\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.795761 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 12:21:41 crc 
kubenswrapper[4816]: I0311 12:21:41.795839 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 12:21:41 crc kubenswrapper[4816]: I0311 12:21:41.889803 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.397485 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"] Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.530610 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerStarted","Data":"dde17337a3d028447d9ee51ec451117399c6017330c2d2d25cb0f2b2b3ec87e9"} Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.758017 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.796349 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.817592 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:42 crc kubenswrapper[4816]: I0311 12:21:42.817646 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:43 crc kubenswrapper[4816]: I0311 12:21:43.546212 4816 generic.go:334] "Generic (PLEG): 
container finished" podID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerID="61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0" exitCode=0 Mar 11 12:21:43 crc kubenswrapper[4816]: I0311 12:21:43.546290 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerDied","Data":"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0"} Mar 11 12:21:43 crc kubenswrapper[4816]: I0311 12:21:43.549951 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:21:43 crc kubenswrapper[4816]: I0311 12:21:43.589164 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 11 12:21:45 crc kubenswrapper[4816]: I0311 12:21:45.129078 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 12:21:45 crc kubenswrapper[4816]: I0311 12:21:45.129592 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 12:21:45 crc kubenswrapper[4816]: I0311 12:21:45.581356 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerStarted","Data":"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04"} Mar 11 12:21:46 crc kubenswrapper[4816]: I0311 12:21:46.139472 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:46 crc kubenswrapper[4816]: I0311 12:21:46.139564 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 12:21:46 crc kubenswrapper[4816]: I0311 12:21:46.597543 4816 generic.go:334] "Generic (PLEG): container finished" podID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerID="351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04" exitCode=0 Mar 11 12:21:46 crc kubenswrapper[4816]: I0311 12:21:46.597625 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerDied","Data":"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04"} Mar 11 12:21:47 crc kubenswrapper[4816]: I0311 12:21:47.614166 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerStarted","Data":"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b"} Mar 11 12:21:47 crc kubenswrapper[4816]: I0311 12:21:47.645178 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c9c6p" podStartSLOduration=3.1857693129999998 podStartE2EDuration="6.645148207s" podCreationTimestamp="2026-03-11 12:21:41 +0000 UTC" firstStartedPulling="2026-03-11 12:21:43.549435199 +0000 UTC m=+1390.140699176" lastFinishedPulling="2026-03-11 12:21:47.008814103 +0000 UTC m=+1393.600078070" observedRunningTime="2026-03-11 12:21:47.640430322 +0000 UTC m=+1394.231694339" watchObservedRunningTime="2026-03-11 12:21:47.645148207 +0000 UTC m=+1394.236412214" Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.808302 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.809558 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.811788 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.820149 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.890214 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:51 crc kubenswrapper[4816]: I0311 12:21:51.890330 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:21:52 crc kubenswrapper[4816]: I0311 12:21:52.679873 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 12:21:52 crc kubenswrapper[4816]: I0311 12:21:52.687256 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 12:21:52 crc kubenswrapper[4816]: I0311 12:21:52.959241 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c9c6p" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" probeResult="failure" output=< Mar 11 12:21:52 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:21:52 crc kubenswrapper[4816]: > Mar 11 12:21:55 crc kubenswrapper[4816]: I0311 12:21:55.135535 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 12:21:55 crc kubenswrapper[4816]: I0311 12:21:55.136218 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 12:21:55 crc kubenswrapper[4816]: I0311 12:21:55.141591 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-metadata-0" Mar 11 12:21:55 crc kubenswrapper[4816]: I0311 12:21:55.142713 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 12:21:57 crc kubenswrapper[4816]: I0311 12:21:57.669243 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.163216 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"] Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.165850 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553862-ldg69" Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.169511 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.171091 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.171812 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.183643 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"] Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.283597 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") pod \"auto-csr-approver-29553862-ldg69\" (UID: \"787da494-4b4f-4a96-9e39-45179c456dc0\") " pod="openshift-infra/auto-csr-approver-29553862-ldg69" Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.387205 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") pod \"auto-csr-approver-29553862-ldg69\" (UID: \"787da494-4b4f-4a96-9e39-45179c456dc0\") " pod="openshift-infra/auto-csr-approver-29553862-ldg69" Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.426283 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") pod \"auto-csr-approver-29553862-ldg69\" (UID: \"787da494-4b4f-4a96-9e39-45179c456dc0\") " pod="openshift-infra/auto-csr-approver-29553862-ldg69" Mar 11 12:22:00 crc kubenswrapper[4816]: I0311 12:22:00.491102 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553862-ldg69" Mar 11 12:22:01 crc kubenswrapper[4816]: W0311 12:22:01.033369 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod787da494_4b4f_4a96_9e39_45179c456dc0.slice/crio-d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8 WatchSource:0}: Error finding container d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8: Status 404 returned error can't find the container with id d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8 Mar 11 12:22:01 crc kubenswrapper[4816]: I0311 12:22:01.045630 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"] Mar 11 12:22:01 crc kubenswrapper[4816]: I0311 12:22:01.779046 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553862-ldg69" event={"ID":"787da494-4b4f-4a96-9e39-45179c456dc0","Type":"ContainerStarted","Data":"d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8"} Mar 11 12:22:01 crc 
kubenswrapper[4816]: I0311 12:22:01.960473 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:22:02 crc kubenswrapper[4816]: I0311 12:22:02.028089 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:22:02 crc kubenswrapper[4816]: I0311 12:22:02.205962 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"] Mar 11 12:22:02 crc kubenswrapper[4816]: I0311 12:22:02.793379 4816 generic.go:334] "Generic (PLEG): container finished" podID="787da494-4b4f-4a96-9e39-45179c456dc0" containerID="d0874559d26089e67dcd3126789f0cf0dc3ed1323323af96fe7e8ee67fbd532f" exitCode=0 Mar 11 12:22:02 crc kubenswrapper[4816]: I0311 12:22:02.793450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553862-ldg69" event={"ID":"787da494-4b4f-4a96-9e39-45179c456dc0","Type":"ContainerDied","Data":"d0874559d26089e67dcd3126789f0cf0dc3ed1323323af96fe7e8ee67fbd532f"} Mar 11 12:22:03 crc kubenswrapper[4816]: I0311 12:22:03.804541 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c9c6p" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" containerID="cri-o://07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" gracePeriod=2 Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.293466 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553862-ldg69" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.301421 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.391861 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") pod \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.392074 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") pod \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.392223 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") pod \"787da494-4b4f-4a96-9e39-45179c456dc0\" (UID: \"787da494-4b4f-4a96-9e39-45179c456dc0\") " Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.393119 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities" (OuterVolumeSpecName: "utilities") pod "7eab4337-089e-4a7c-b1b2-0d902c26f9bb" (UID: "7eab4337-089e-4a7c-b1b2-0d902c26f9bb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.393343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") pod \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\" (UID: \"7eab4337-089e-4a7c-b1b2-0d902c26f9bb\") " Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.396421 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.410240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw" (OuterVolumeSpecName: "kube-api-access-m78hw") pod "7eab4337-089e-4a7c-b1b2-0d902c26f9bb" (UID: "7eab4337-089e-4a7c-b1b2-0d902c26f9bb"). InnerVolumeSpecName "kube-api-access-m78hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.419635 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b" (OuterVolumeSpecName: "kube-api-access-8vv9b") pod "787da494-4b4f-4a96-9e39-45179c456dc0" (UID: "787da494-4b4f-4a96-9e39-45179c456dc0"). InnerVolumeSpecName "kube-api-access-8vv9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.498316 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vv9b\" (UniqueName: \"kubernetes.io/projected/787da494-4b4f-4a96-9e39-45179c456dc0-kube-api-access-8vv9b\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.498353 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m78hw\" (UniqueName: \"kubernetes.io/projected/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-kube-api-access-m78hw\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.534557 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7eab4337-089e-4a7c-b1b2-0d902c26f9bb" (UID: "7eab4337-089e-4a7c-b1b2-0d902c26f9bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.600708 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eab4337-089e-4a7c-b1b2-0d902c26f9bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822416 4816 generic.go:334] "Generic (PLEG): container finished" podID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerID="07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" exitCode=0 Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822525 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerDied","Data":"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b"} Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822555 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c9c6p" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822594 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9c6p" event={"ID":"7eab4337-089e-4a7c-b1b2-0d902c26f9bb","Type":"ContainerDied","Data":"dde17337a3d028447d9ee51ec451117399c6017330c2d2d25cb0f2b2b3ec87e9"} Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.822621 4816 scope.go:117] "RemoveContainer" containerID="07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.827993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553862-ldg69" event={"ID":"787da494-4b4f-4a96-9e39-45179c456dc0","Type":"ContainerDied","Data":"d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8"} Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.828551 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45894c0b9e08495461a2e93ed4ff33fe9f511a52146f27eba8ae8e2789a5bb8" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.828064 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553862-ldg69" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.866282 4816 scope.go:117] "RemoveContainer" containerID="351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.875816 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"] Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.886669 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c9c6p"] Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.905218 4816 scope.go:117] "RemoveContainer" containerID="61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.934495 4816 scope.go:117] "RemoveContainer" containerID="07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" Mar 11 12:22:04 crc kubenswrapper[4816]: E0311 12:22:04.935458 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b\": container with ID starting with 07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b not found: ID does not exist" containerID="07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.935529 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b"} err="failed to get container status \"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b\": rpc error: code = NotFound desc = could not find container \"07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b\": container with ID starting with 07cab6208632be720f7bd7e72d37d12b39444b2dbdbdd97343950ac444c3b44b not found: ID 
does not exist" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.935567 4816 scope.go:117] "RemoveContainer" containerID="351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04" Mar 11 12:22:04 crc kubenswrapper[4816]: E0311 12:22:04.935869 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04\": container with ID starting with 351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04 not found: ID does not exist" containerID="351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.935890 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04"} err="failed to get container status \"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04\": rpc error: code = NotFound desc = could not find container \"351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04\": container with ID starting with 351d1b2af6835e229f75039e47bb93ab25d68cb59a137fa24a8a5d6c16d38e04 not found: ID does not exist" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.935905 4816 scope.go:117] "RemoveContainer" containerID="61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0" Mar 11 12:22:04 crc kubenswrapper[4816]: E0311 12:22:04.936545 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0\": container with ID starting with 61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0 not found: ID does not exist" containerID="61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0" Mar 11 12:22:04 crc kubenswrapper[4816]: I0311 12:22:04.936574 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0"} err="failed to get container status \"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0\": rpc error: code = NotFound desc = could not find container \"61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0\": container with ID starting with 61b458c6b53b9f29ec8711e90f8d5237e5c40ab1fc6dcac7ce59ef8e8e0ce3d0 not found: ID does not exist" Mar 11 12:22:05 crc kubenswrapper[4816]: I0311 12:22:05.395365 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:22:05 crc kubenswrapper[4816]: I0311 12:22:05.411501 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553856-7k69r"] Mar 11 12:22:06 crc kubenswrapper[4816]: I0311 12:22:06.140808 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fb6b17-9d8a-4f10-8a93-a3e65f470a27" path="/var/lib/kubelet/pods/79fb6b17-9d8a-4f10-8a93-a3e65f470a27/volumes" Mar 11 12:22:06 crc kubenswrapper[4816]: I0311 12:22:06.141783 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" path="/var/lib/kubelet/pods/7eab4337-089e-4a7c-b1b2-0d902c26f9bb/volumes" Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.514571 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.515486 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.515552 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.516284 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.516571 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a" gracePeriod=600 Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.901577 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a" exitCode=0 Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.901665 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a"} Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.902206 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3"} Mar 11 12:22:09 crc kubenswrapper[4816]: I0311 12:22:09.902265 4816 scope.go:117] "RemoveContainer" containerID="92bc406893843c03ac9aa6138b10c838c501d62aa37baf4b9b92254baf796e96" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.856880 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:17 crc kubenswrapper[4816]: E0311 12:22:17.858925 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="extract-utilities" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.858998 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="extract-utilities" Mar 11 12:22:17 crc kubenswrapper[4816]: E0311 12:22:17.859068 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787da494-4b4f-4a96-9e39-45179c456dc0" containerName="oc" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="787da494-4b4f-4a96-9e39-45179c456dc0" containerName="oc" Mar 11 12:22:17 crc kubenswrapper[4816]: E0311 12:22:17.859187 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="extract-content" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859234 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="extract-content" Mar 11 12:22:17 crc kubenswrapper[4816]: E0311 12:22:17.859365 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859426 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859696 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="787da494-4b4f-4a96-9e39-45179c456dc0" containerName="oc" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.859766 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eab4337-089e-4a7c-b1b2-0d902c26f9bb" containerName="registry-server" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.860549 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.873861 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.901651 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:17 crc kubenswrapper[4816]: I0311 12:22:17.945292 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:17.997975 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4bcf-account-create-update-gkcsc"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.021980 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.022049 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.092890 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.093162 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="502b3843-8246-4715-9735-dfc0336caacb" containerName="openstackclient" containerID="cri-o://fd6533a10f6d22b4d1d7a2a73ad8cc4591438b77aefeced48dbf3b4526cf28f0" gracePeriod=2 Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.137117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.137497 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.138429 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 
crc kubenswrapper[4816]: I0311 12:22:18.179108 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16" path="/var/lib/kubelet/pods/f4e2fceb-8b8c-44ee-a05b-ddb3e8ff4f16/volumes" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.180289 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.180341 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.180753 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502b3843-8246-4715-9735-dfc0336caacb" containerName="openstackclient" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.180781 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="502b3843-8246-4715-9735-dfc0336caacb" containerName="openstackclient" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.181532 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="502b3843-8246-4715-9735-dfc0336caacb" containerName="openstackclient" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.182331 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.195004 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") pod \"cinder-4bcf-account-create-update-nv5hk\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") " pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.205564 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.209014 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.215844 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.221876 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.235504 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.244589 4816 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.244666 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config podName:91cdfd54-2ee7-490e-bf3f-563406e59cda nodeName:}" failed. No retries permitted until 2026-03-11 12:22:18.744645431 +0000 UTC m=+1425.335909398 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config") pod "ovn-controller-metrics-r8xbm" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda") : configmap "ovncontroller-metrics-config" not found Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.266977 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.347322 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.347394 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.347451 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.347486 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.360152 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-963f-account-create-update-w2lrf"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.425381 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451002 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451098 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451161 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451218 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.451976 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.452494 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.460186 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.502441 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.504154 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.545149 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.572990 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") pod \"neutron-963f-account-create-update-9hnkv\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.588234 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") pod \"root-account-create-update-snf5b\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.593211 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.652767 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.655929 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.656102 4816 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.699936 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4a8d-account-create-update-gxhxz"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.700481 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.709329 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.720033 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.743608 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.743905 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd" containerID="cri-o://8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" gracePeriod=30 Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.746355 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="openstack-network-exporter" containerID="cri-o://6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" gracePeriod=30 Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.757907 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.758053 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.759217 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.759293 4816 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.759336 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config podName:91cdfd54-2ee7-490e-bf3f-563406e59cda nodeName:}" failed. No retries permitted until 2026-03-11 12:22:19.759322972 +0000 UTC m=+1426.350586939 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config") pod "ovn-controller-metrics-r8xbm" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda") : configmap "ovncontroller-metrics-config" not found Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.781310 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:22:18 crc kubenswrapper[4816]: I0311 12:22:18.795376 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7l6hp"] Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.883731 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 12:22:18 crc kubenswrapper[4816]: E0311 12:22:18.883835 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:19.383803276 +0000 UTC m=+1425.975067243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.006379 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") pod \"barbican-4a8d-account-create-update-2lrkx\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.012336 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fjmnw"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.153355 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fjmnw"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.178595 4816 generic.go:334] "Generic (PLEG): container finished" podID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerID="6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" exitCode=2 Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.178657 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerDied","Data":"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7"} Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.217361 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.240866 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c8d3-account-create-update-85zqd"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.252441 4816 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.298996 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84rn8"] Mar 11 12:22:19 crc kubenswrapper[4816]: E0311 12:22:19.417684 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 12:22:19 crc kubenswrapper[4816]: E0311 12:22:19.417765 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:20.41774754 +0000 UTC m=+1427.009011507 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.424174 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.490382 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.491222 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-r8xbm" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerName="openstack-network-exporter" containerID="cri-o://be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8" gracePeriod=30 Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.599870 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4b4ms"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.709899 4816 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.758521 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4b4ms"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.775981 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-91ce-account-create-update-n8mz8"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.788931 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-tdv64"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.804287 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-tdv64"] Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.821162 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"] Mar 11 12:22:19 crc kubenswrapper[4816]: E0311 12:22:19.843476 4816 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found Mar 11 12:22:19 crc kubenswrapper[4816]: E0311 12:22:19.843560 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config podName:91cdfd54-2ee7-490e-bf3f-563406e59cda nodeName:}" failed. No retries permitted until 2026-03-11 12:22:21.843539828 +0000 UTC m=+1428.434803795 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config") pod "ovn-controller-metrics-r8xbm" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda") : configmap "ovncontroller-metrics-config" not found Mar 11 12:22:19 crc kubenswrapper[4816]: I0311 12:22:19.927581 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-53ba-account-create-update-2vf2k"] Mar 11 12:22:20 crc kubenswrapper[4816]: W0311 12:22:20.050824 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1dd25da_51d6_45f0_b70c_f1baa17d2da3.slice/crio-1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5 WatchSource:0}: Error finding container 1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5: Status 404 returned error can't find the container with id 1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.067817 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rjxsf"] Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.093080 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:20 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:20 crc 
kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: if [ -n "cinder" ]; then Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="cinder" Mar 11 12:22:20 crc kubenswrapper[4816]: else Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:20 crc kubenswrapper[4816]: fi Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:20 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:20 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:20 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:20 crc kubenswrapper[4816]: # support updates Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.096389 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-4bcf-account-create-update-nv5hk" podUID="b1dd25da-51d6-45f0-b70c-f1baa17d2da3" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.100636 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rjxsf"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.132365 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.133214 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="openstack-network-exporter" containerID="cri-o://4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db" gracePeriod=300 Mar 11 12:22:20 crc 
kubenswrapper[4816]: E0311 12:22:20.285066 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:20 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: if [ -n "cinder" ]; then Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="cinder" Mar 11 12:22:20 crc kubenswrapper[4816]: else Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:20 crc kubenswrapper[4816]: fi Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:20 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:20 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:20 crc kubenswrapper[4816]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:20 crc kubenswrapper[4816]: # support updates Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.285515 4816 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-84rn8" message=< Mar 11 12:22:20 crc kubenswrapper[4816]: Exiting ovn-controller (1) [ OK ] Mar 11 12:22:20 crc kubenswrapper[4816]: > Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.285563 4816 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-84rn8" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" containerID="cri-o://b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.285616 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-84rn8" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" containerID="cri-o://b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.286517 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-4bcf-account-create-update-nv5hk" podUID="b1dd25da-51d6-45f0-b70c-f1baa17d2da3" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.303746 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-r8xbm_91cdfd54-2ee7-490e-bf3f-563406e59cda/openstack-network-exporter/0.log" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.303809 4816 generic.go:334] "Generic (PLEG): container finished" podID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerID="be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8" exitCode=2 Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.474838 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.474928 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:22.474902089 +0000 UTC m=+1429.066166056 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.521624 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="ovsdbserver-sb" containerID="cri-o://ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889" gracePeriod=300 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.615312 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2772ef82-fe14-4f4d-8349-8ee515e39979" path="/var/lib/kubelet/pods/2772ef82-fe14-4f4d-8349-8ee515e39979/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.616559 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a1317c-41a6-4589-949b-e422d7fe8837" 
path="/var/lib/kubelet/pods/27a1317c-41a6-4589-949b-e422d7fe8837/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.650350 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288dd774-6e04-45d2-b786-c7f2be7fbeae" path="/var/lib/kubelet/pods/288dd774-6e04-45d2-b786-c7f2be7fbeae/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.654523 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35fe8af0-2f02-4d81-ae03-9d399900494c" path="/var/lib/kubelet/pods/35fe8af0-2f02-4d81-ae03-9d399900494c/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.655692 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae20611-891b-49ee-b5b8-0dad8af80906" path="/var/lib/kubelet/pods/3ae20611-891b-49ee-b5b8-0dad8af80906/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.670856 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c643aa04-ce8d-4c3b-befc-ecdf63e35de8" path="/var/lib/kubelet/pods/c643aa04-ce8d-4c3b-befc-ecdf63e35de8/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.673918 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82cb42a-5dbf-43d1-a71c-18b3e6d252d6" path="/var/lib/kubelet/pods/e82cb42a-5dbf-43d1-a71c-18b3e6d252d6/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.674753 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee11077d-39aa-44c4-9cf3-a8a80647bc50" path="/var/lib/kubelet/pods/ee11077d-39aa-44c4-9cf3-a8a80647bc50/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.675470 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92c8acc-1a4a-4f28-a123-2f5b8b6905af" path="/var/lib/kubelet/pods/f92c8acc-1a4a-4f28-a123-2f5b8b6905af/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.688540 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee1eb20-6fbe-4e59-a434-54c2e8a6165d" 
path="/var/lib/kubelet/pods/fee1eb20-6fbe-4e59-a434-54c2e8a6165d/volumes" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689327 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9nggr"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689358 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-nv5hk" event={"ID":"b1dd25da-51d6-45f0-b70c-f1baa17d2da3","Type":"ContainerStarted","Data":"1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5"} Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689825 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8xbm" event={"ID":"91cdfd54-2ee7-490e-bf3f-563406e59cda","Type":"ContainerDied","Data":"be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8"} Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689848 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9nggr"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689929 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689952 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689967 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689978 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-n98v5"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.689990 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-n98v5"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690001 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 
11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690017 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-377b-account-create-update-gb4b2"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690029 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690046 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690059 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690105 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690120 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690130 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wsfdf"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690140 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xm9d9"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690149 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xm9d9"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690160 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690170 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690180 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qt9tz"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 
12:22:20.690204 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690468 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c5b6658f-tdgsh" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-httpd" containerID="cri-o://ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.690924 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="cinder-scheduler" containerID="cri-o://b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.691114 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="dnsmasq-dns" containerID="cri-o://373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835" gracePeriod=10 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.692315 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api-log" containerID="cri-o://691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.692469 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5ffd6fb588-7hftz" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-log" containerID="cri-o://3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.692879 4816 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-server" containerID="cri-o://e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693285 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c5b6658f-tdgsh" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-server" containerID="cri-o://526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693358 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="openstack-network-exporter" containerID="cri-o://5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b" gracePeriod=300 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693415 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="probe" containerID="cri-o://4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693510 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-expirer" containerID="cri-o://cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693575 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-auditor" containerID="cri-o://a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" 
gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693641 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-updater" containerID="cri-o://1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693656 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-auditor" containerID="cri-o://25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693669 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-replicator" containerID="cri-o://3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693682 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-server" containerID="cri-o://fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693694 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-updater" containerID="cri-o://bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693710 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" 
containerName="account-reaper" containerID="cri-o://712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693723 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-replicator" containerID="cri-o://424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693737 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-server" containerID="cri-o://9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693800 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6867c6dbc5-lzgfd" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-httpd" containerID="cri-o://d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693812 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6867c6dbc5-lzgfd" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-api" containerID="cri-o://fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693833 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api" containerID="cri-o://c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693854 4816 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="swift-recon-cron" containerID="cri-o://4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693887 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5ffd6fb588-7hftz" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-api" containerID="cri-o://6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693907 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-replicator" containerID="cri-o://d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.693934 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-auditor" containerID="cri-o://acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.694208 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="rsync" containerID="cri-o://68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.702450 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:20 crc kubenswrapper[4816]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: if [ -n "" ]; then Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="" Mar 11 12:22:20 crc kubenswrapper[4816]: else Mar 11 12:22:20 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:20 crc kubenswrapper[4816]: fi Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:20 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:20 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:20 crc kubenswrapper[4816]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:20 crc kubenswrapper[4816]: # support updates Mar 11 12:22:20 crc kubenswrapper[4816]: Mar 11 12:22:20 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.704540 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-snf5b" podUID="3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.709220 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-l5lds"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.739240 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-l5lds"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.749694 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.750016 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-log" containerID="cri-o://c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.750231 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-httpd" containerID="cri-o://8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.787508 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] 
Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.796266 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.796617 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-855897fd55-t7sfb" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker-log" containerID="cri-o://f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.797174 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-855897fd55-t7sfb" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker" containerID="cri-o://4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.802735 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" containerID="cri-o://9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" gracePeriod=29 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.806986 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.807372 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-log" containerID="cri-o://c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.807959 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-httpd" containerID="cri-o://9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.824043 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.824515 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" containerID="cri-o://494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.824977 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" containerID="cri-o://05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.845951 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.846330 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64b59f8d4-2vxd9" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api-log" containerID="cri-o://8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.846841 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64b59f8d4-2vxd9" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api" containerID="cri-o://5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.857488 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="ovsdbserver-nb" containerID="cri-o://e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6" gracePeriod=300 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.881122 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.911887 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.912222 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener-log" containerID="cri-o://5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.912794 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener" containerID="cri-o://a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.925521 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.933365 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log" containerID="cri-o://9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.934292 4816 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api" containerID="cri-o://f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d" gracePeriod=30 Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.949267 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cnlpc"] Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.969579 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:20 crc kubenswrapper[4816]: E0311 12:22:20.969651 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:21.469631796 +0000 UTC m=+1428.060895753 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:20 crc kubenswrapper[4816]: I0311 12:22:20.987819 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cnlpc"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.017738 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.035369 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.083678 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zv62x"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.102318 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4z7mr"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.134370 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zv62x"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.176613 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" containerID="cri-o://e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" gracePeriod=29 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.176706 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-txccq"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.199636 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.212872 4816 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-r8xbm_91cdfd54-2ee7-490e-bf3f-563406e59cda/openstack-network-exporter/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.212964 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.219629 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-txccq"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.240239 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.266559 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.266825 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287652 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287707 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287787 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287855 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.287918 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.288106 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") pod \"91cdfd54-2ee7-490e-bf3f-563406e59cda\" (UID: \"91cdfd54-2ee7-490e-bf3f-563406e59cda\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.289421 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.290037 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config" (OuterVolumeSpecName: "config") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.290806 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.293997 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91cdfd54-2ee7-490e-bf3f-563406e59cda-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.294021 4816 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.294034 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91cdfd54-2ee7-490e-bf3f-563406e59cda-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.319807 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4kpfn"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.336089 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.351984 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4kpfn"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.364877 4816 generic.go:334] "Generic (PLEG): container finished" podID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerID="c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.365059 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerDied","Data":"c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.368774 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.373875 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x" (OuterVolumeSpecName: "kube-api-access-7lq9x") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "kube-api-access-7lq9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.381574 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.388918 4816 generic.go:334] "Generic (PLEG): container finished" podID="7795071e-2de0-43cb-b225-cfed54570d94" containerID="8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.389039 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerDied","Data":"8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.390489 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3c3c-account-create-update-2whdq"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.396776 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lq9x\" (UniqueName: \"kubernetes.io/projected/91cdfd54-2ee7-490e-bf3f-563406e59cda-kube-api-access-7lq9x\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.398872 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-85nd9"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.400448 4816 generic.go:334] "Generic (PLEG): container finished" podID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerID="691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.400542 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerDied","Data":"691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad"} Mar 11 12:22:21 crc 
kubenswrapper[4816]: I0311 12:22:21.406438 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.406716 4816 generic.go:334] "Generic (PLEG): container finished" podID="32a279c7-00a8-4e98-8356-91e219416a22" containerID="373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835" exitCode=0 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.406910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerDied","Data":"373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.428435 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq" containerID="cri-o://0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7" gracePeriod=604800 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.428882 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.450805 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16e7d30-3235-44f2-81b4-c0c828071bbb/ovsdbserver-nb/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.450884 4816 generic.go:334] "Generic (PLEG): container finished" podID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerID="5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b" exitCode=2 Mar 11 12:22:21 
crc kubenswrapper[4816]: I0311 12:22:21.450918 4816 generic.go:334] "Generic (PLEG): container finished" podID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerID="e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.451049 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerDied","Data":"5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.451092 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerDied","Data":"e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.465334 4816 generic.go:334] "Generic (PLEG): container finished" podID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerID="c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.465486 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerDied","Data":"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.538445 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.538609 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.538677 4816 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:22.538656857 +0000 UTC m=+1429.129920824 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.634281 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-snf5b" event={"ID":"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb","Type":"ContainerStarted","Data":"37dc46cbca9b814e026266eb10b0888ee0b98d2b5a77de8a934c3e1d5742969a"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.652600 4816 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-snf5b" secret="" err="secret \"galera-openstack-cell1-dockercfg-n5gxr\" not found" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.654392 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:21 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: if [ -n "neutron" ]; then Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="neutron" Mar 11 12:22:21 crc kubenswrapper[4816]: else Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:21 crc kubenswrapper[4816]: fi Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:21 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:21 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:21 crc kubenswrapper[4816]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:21 crc kubenswrapper[4816]: # support updates Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.656346 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-963f-account-create-update-9hnkv" podUID="5e637fcd-e45c-479c-856d-086d642af3bb" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.663301 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.695026 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.706269 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.707175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerDied","Data":"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85"} Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.706855 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:21 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:21 
crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: if [ -n "" ]; then Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="" Mar 11 12:22:21 crc kubenswrapper[4816]: else Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:21 crc kubenswrapper[4816]: fi Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:21 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:21 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:21 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:21 crc kubenswrapper[4816]: # support updates Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.706663 4816 generic.go:334] "Generic (PLEG): container finished" podID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerID="5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.707576 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor" containerID="cri-o://adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.708321 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-snf5b" podUID="3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.729338 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdblc"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.730167 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "91cdfd54-2ee7-490e-bf3f-563406e59cda" (UID: "91cdfd54-2ee7-490e-bf3f-563406e59cda"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.737099 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-r8xbm_91cdfd54-2ee7-490e-bf3f-563406e59cda/openstack-network-exporter/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.746585 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.754238 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8xbm" event={"ID":"91cdfd54-2ee7-490e-bf3f-563406e59cda","Type":"ContainerDied","Data":"f248cb3d03b08e499e3214d91a64d19cf5108c7a76d1c30f73bf2b55bdc66e0a"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.754328 4816 scope.go:117] "RemoveContainer" containerID="be5c0e05e1987846058e7b0cb0a3139e1568599a10f5067e16f3de74b6995fb8" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.754553 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-r8xbm" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.759643 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91cdfd54-2ee7-490e-bf3f-563406e59cda-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.759782 4816 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.761498 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:21 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: if [ -n "barbican" ]; then Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="barbican" Mar 11 12:22:21 crc kubenswrapper[4816]: else Mar 11 12:22:21 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:21 crc kubenswrapper[4816]: fi Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:21 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:21 crc kubenswrapper[4816]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:21 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:21 crc kubenswrapper[4816]: # support updates Mar 11 12:22:21 crc kubenswrapper[4816]: Mar 11 12:22:21 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.762821 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-4a8d-account-create-update-2lrkx" podUID="c47c9b57-0735-415f-a1a1-4b3096e3fbcf" Mar 11 12:22:21 crc kubenswrapper[4816]: E0311 12:22:21.763016 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts podName:3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb nodeName:}" failed. No retries permitted until 2026-03-11 12:22:22.262983078 +0000 UTC m=+1428.854247055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts") pod "root-account-create-update-snf5b" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb") : configmap "openstack-cell1-scripts" not found Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.763497 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.765972 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.776335 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r2t5s"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.788496 4816 generic.go:334] "Generic (PLEG): container finished" podID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerID="494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.788658 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerDied","Data":"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.804644 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.815606 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.815848 4816 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" containerID="cri-o://60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.819355 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.831494 4816 generic.go:334] "Generic (PLEG): container finished" podID="2de58390-335b-40cc-8461-d931d3b22e41" containerID="b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17" exitCode=0 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.831584 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8" event={"ID":"2de58390-335b-40cc-8461-d931d3b22e41","Type":"ContainerDied","Data":"b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.834567 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862005 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe419fb1-1901-4fd4-9d9c-8884651e3ad9/ovsdbserver-sb/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862065 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerID="4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db" exitCode=2 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862091 4816 generic.go:334] "Generic (PLEG): container finished" podID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerID="ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerDied","Data":"4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.862548 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerDied","Data":"ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.876969 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="galera" containerID="cri-o://08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054" gracePeriod=30 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.878321 4816 generic.go:334] "Generic (PLEG): container finished" podID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerID="f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.878491 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerDied","Data":"f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.901860 4816 generic.go:334] "Generic (PLEG): container finished" podID="502b3843-8246-4715-9735-dfc0336caacb" containerID="fd6533a10f6d22b4d1d7a2a73ad8cc4591438b77aefeced48dbf3b4526cf28f0" exitCode=137 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.905576 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84rn8" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.910201 4816 generic.go:334] "Generic (PLEG): container finished" podID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerID="3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f" exitCode=143 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.910332 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerDied","Data":"3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.925836 4816 generic.go:334] "Generic (PLEG): container finished" podID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerID="d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" exitCode=0 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.925946 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerDied","Data":"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.938943 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.949402 4816 generic.go:334] "Generic (PLEG): container finished" podID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerID="ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff" exitCode=0 Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.949478 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.949507 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerDied","Data":"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"} Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.950391 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe419fb1-1901-4fd4-9d9c-8884651e3ad9/ovsdbserver-sb/0.log" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.950452 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965260 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965339 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965372 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965433 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965552 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965692 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.965735 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") pod \"2de58390-335b-40cc-8461-d931d3b22e41\" (UID: \"2de58390-335b-40cc-8461-d931d3b22e41\") " Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.966369 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.968539 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.968572 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.968715 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run" (OuterVolumeSpecName: "var-run") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.969379 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts" (OuterVolumeSpecName: "scripts") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.972535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw" (OuterVolumeSpecName: "kube-api-access-bpfnw") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "kube-api-access-bpfnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:21 crc kubenswrapper[4816]: I0311 12:22:21.990899 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-r8xbm"] Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002498 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002602 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002617 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002628 4816 generic.go:334] "Generic (PLEG): container finished" 
podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002641 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002653 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002662 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002671 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002680 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002689 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002698 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" exitCode=0 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002810 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002855 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002873 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002898 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002913 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002928 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002956 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002968 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002980 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.002993 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.003008 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d"} Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.007474 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.013311 4816 generic.go:334] "Generic (PLEG): container finished" podID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerID="9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de" exitCode=143 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.015036 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerDied","Data":"9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de"} Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.026374 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:22 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: if [ -n "cinder" ]; then Mar 11 12:22:22 crc kubenswrapper[4816]: GRANT_DATABASE="cinder" Mar 11 12:22:22 crc kubenswrapper[4816]: else Mar 11 12:22:22 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:22 crc kubenswrapper[4816]: fi Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:22 crc kubenswrapper[4816]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:22 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:22 crc kubenswrapper[4816]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:22 crc kubenswrapper[4816]: # support updates Mar 11 12:22:22 crc kubenswrapper[4816]: Mar 11 12:22:22 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.027582 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-4bcf-account-create-update-nv5hk" podUID="b1dd25da-51d6-45f0-b70c-f1baa17d2da3" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.059446 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "2de58390-335b-40cc-8461-d931d3b22e41" (UID: "2de58390-335b-40cc-8461-d931d3b22e41"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067188 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067244 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067581 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067626 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067693 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067787 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067818 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067874 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067927 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") pod \"502b3843-8246-4715-9735-dfc0336caacb\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.067995 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") pod \"502b3843-8246-4715-9735-dfc0336caacb\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068047 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") pod \"502b3843-8246-4715-9735-dfc0336caacb\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " Mar 11 
12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068078 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") pod \"502b3843-8246-4715-9735-dfc0336caacb\" (UID: \"502b3843-8246-4715-9735-dfc0336caacb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068167 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068245 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068314 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068390 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfg5z\" (UniqueName: 
\"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") pod \"32a279c7-00a8-4e98-8356-91e219416a22\" (UID: \"32a279c7-00a8-4e98-8356-91e219416a22\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068437 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\" (UID: \"fe419fb1-1901-4fd4-9d9c-8884651e3ad9\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.068812 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config" (OuterVolumeSpecName: "config") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069083 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2de58390-335b-40cc-8461-d931d3b22e41-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069161 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069220 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069399 4816 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc 
kubenswrapper[4816]: I0311 12:22:22.069562 4816 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069622 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de58390-335b-40cc-8461-d931d3b22e41-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069688 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpfnw\" (UniqueName: \"kubernetes.io/projected/2de58390-335b-40cc-8461-d931d3b22e41-kube-api-access-bpfnw\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069744 4816 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2de58390-335b-40cc-8461-d931d3b22e41-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.069925 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.074471 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts" (OuterVolumeSpecName: "scripts") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.086240 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.086335 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2" (OuterVolumeSpecName: "kube-api-access-xhzf2") pod "502b3843-8246-4715-9735-dfc0336caacb" (UID: "502b3843-8246-4715-9735-dfc0336caacb"). InnerVolumeSpecName "kube-api-access-xhzf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.086411 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6" (OuterVolumeSpecName: "kube-api-access-6wmq6") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "kube-api-access-6wmq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.091010 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq" containerID="cri-o://18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" gracePeriod=604800 Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.091433 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z" (OuterVolumeSpecName: "kube-api-access-jfg5z") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "kube-api-access-jfg5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.124446 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.146814 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec4faaf-e219-4b01-b3b9-0d6757a38154" path="/var/lib/kubelet/pods/1ec4faaf-e219-4b01-b3b9-0d6757a38154/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.147902 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fadc66-c846-46c0-a002-efeb7656f2b8" path="/var/lib/kubelet/pods/36fadc66-c846-46c0-a002-efeb7656f2b8/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.148932 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f519dc2-e88b-4e4b-9637-c3e172b81bfa" path="/var/lib/kubelet/pods/3f519dc2-e88b-4e4b-9637-c3e172b81bfa/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.149910 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403fec7f-c194-4bdd-a620-34aefb5d677c" path="/var/lib/kubelet/pods/403fec7f-c194-4bdd-a620-34aefb5d677c/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.153563 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6268fe92-5c93-43c7-95bc-f30befda5d65" path="/var/lib/kubelet/pods/6268fe92-5c93-43c7-95bc-f30befda5d65/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.154639 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632c5d32-5370-401a-8202-58e0ec70f357" path="/var/lib/kubelet/pods/632c5d32-5370-401a-8202-58e0ec70f357/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.155518 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66951176-170f-4d49-9a92-aeeb66f4a79c" path="/var/lib/kubelet/pods/66951176-170f-4d49-9a92-aeeb66f4a79c/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.157490 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c9952da-6281-45f2-8b45-30caa27b8d39" 
path="/var/lib/kubelet/pods/7c9952da-6281-45f2-8b45-30caa27b8d39/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.158322 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fca72cd-9caa-4029-8c20-1623a315702d" path="/var/lib/kubelet/pods/7fca72cd-9caa-4029-8c20-1623a315702d/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.159503 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d3e7fa1-3f66-495b-be44-cf97eec043c1" path="/var/lib/kubelet/pods/8d3e7fa1-3f66-495b-be44-cf97eec043c1/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.160979 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" path="/var/lib/kubelet/pods/91cdfd54-2ee7-490e-bf3f-563406e59cda/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.161859 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd32333-bdaa-461b-ac10-324291d1e5d3" path="/var/lib/kubelet/pods/9fd32333-bdaa-461b-ac10-324291d1e5d3/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.162652 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e0ff63-3d12-4174-9341-ceb21109e000" path="/var/lib/kubelet/pods/a0e0ff63-3d12-4174-9341-ceb21109e000/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.163990 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25abdc0-8516-4747-a589-78db9bc64ca3" path="/var/lib/kubelet/pods/a25abdc0-8516-4747-a589-78db9bc64ca3/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.164791 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac" path="/var/lib/kubelet/pods/b565c7f9-4cb9-43a6-9b2c-0f5ebf1930ac/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.167611 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6745bae-b403-4a86-9148-8baecc00f8b1" 
path="/var/lib/kubelet/pods/b6745bae-b403-4a86-9148-8baecc00f8b1/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.168742 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac" path="/var/lib/kubelet/pods/fe71e5ba-d9ea-4b01-b2fe-3401268ae2ac/volumes" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171458 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171501 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhzf2\" (UniqueName: \"kubernetes.io/projected/502b3843-8246-4715-9735-dfc0336caacb-kube-api-access-xhzf2\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171512 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wmq6\" (UniqueName: \"kubernetes.io/projected/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-kube-api-access-6wmq6\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171521 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171532 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfg5z\" (UniqueName: \"kubernetes.io/projected/32a279c7-00a8-4e98-8356-91e219416a22-kube-api-access-jfg5z\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.171564 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 
12:22:22.171574 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.192348 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.274343 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.274463 4816 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.274728 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts podName:3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb nodeName:}" failed. No retries permitted until 2026-03-11 12:22:23.274706514 +0000 UTC m=+1429.865970481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts") pod "root-account-create-update-snf5b" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb") : configmap "openstack-cell1-scripts" not found Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.385687 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "502b3843-8246-4715-9735-dfc0336caacb" (UID: "502b3843-8246-4715-9735-dfc0336caacb"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.391602 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "502b3843-8246-4715-9735-dfc0336caacb" (UID: "502b3843-8246-4715-9735-dfc0336caacb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.444703 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config" (OuterVolumeSpecName: "config") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.475805 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.478958 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.488258 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.488294 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.488309 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.488320 4816 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/502b3843-8246-4715-9735-dfc0336caacb-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.491372 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.495182 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:26.495135074 +0000 UTC m=+1433.086399041 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.495215 4816 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.552939 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.599473 4816 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.599558 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.599612 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:24.599594314 +0000 UTC m=+1431.190858281 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.626657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32a279c7-00a8-4e98-8356-91e219416a22" (UID: "32a279c7-00a8-4e98-8356-91e219416a22"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.695160 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "502b3843-8246-4715-9735-dfc0336caacb" (UID: "502b3843-8246-4715-9735-dfc0336caacb"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.704153 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a279c7-00a8-4e98-8356-91e219416a22-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.704189 4816 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/502b3843-8246-4715-9735-dfc0336caacb-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.711752 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.779578 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.788214 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.788585 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "fe419fb1-1901-4fd4-9d9c-8884651e3ad9" (UID: "fe419fb1-1901-4fd4-9d9c-8884651e3ad9"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.800019 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:22 crc kubenswrapper[4816]: E0311 12:22:22.800162 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.812848 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16e7d30-3235-44f2-81b4-c0c828071bbb/ovsdbserver-nb/0.log" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.812942 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.818688 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.823463 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.823518 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe419fb1-1901-4fd4-9d9c-8884651e3ad9-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925492 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925557 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925636 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925740 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: 
\"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925832 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925890 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925918 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.925977 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926002 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926130 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926178 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926205 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926260 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926346 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.926384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc 
kubenswrapper[4816]: I0311 12:22:22.926412 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") pod \"e16e7d30-3235-44f2-81b4-c0c828071bbb\" (UID: \"e16e7d30-3235-44f2-81b4-c0c828071bbb\") " Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.928612 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.928776 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config" (OuterVolumeSpecName: "config") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.929770 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts" (OuterVolumeSpecName: "scripts") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.933182 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.933782 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.934513 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.937596 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh" (OuterVolumeSpecName: "kube-api-access-ddtnh") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "kube-api-access-ddtnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.945673 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr" (OuterVolumeSpecName: "kube-api-access-r4nnr") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "kube-api-access-r4nnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.952636 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.980368 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:22:22 crc kubenswrapper[4816]: I0311 12:22:22.984905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030464 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030501 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030512 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030520 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddtnh\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-kube-api-access-ddtnh\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030553 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030562 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030572 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e6d90d2-e7e3-4245-b3a6-042621e01a67-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030579 4816 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-r4nnr\" (UniqueName: \"kubernetes.io/projected/e16e7d30-3235-44f2-81b4-c0c828071bbb-kube-api-access-r4nnr\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030587 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16e7d30-3235-44f2-81b4-c0c828071bbb-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.030597 4816 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3e6d90d2-e7e3-4245-b3a6-042621e01a67-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.032618 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.061522 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.080567 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.082980 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084020 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-httpd" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084041 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-httpd" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084064 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084071 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084081 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084089 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084101 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084108 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084123 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="init" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084129 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="init" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084149 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084160 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084175 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="dnsmasq-dns" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084183 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="dnsmasq-dns" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084198 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="ovsdbserver-nb" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084204 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="ovsdbserver-nb" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084212 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084219 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084232 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-server" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084238 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-server" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.084271 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="ovsdbserver-sb" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.084278 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="ovsdbserver-sb" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.085296 4816 generic.go:334] "Generic (PLEG): container finished" podID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerID="526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000" exitCode=0 Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.085456 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c5b6658f-tdgsh" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088569 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088887 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088909 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de58390-335b-40cc-8461-d931d3b22e41" containerName="ovn-controller" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088931 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088945 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-server" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088955 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" containerName="proxy-httpd" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088964 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" containerName="ovsdbserver-sb" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088981 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" containerName="ovsdbserver-nb" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.088996 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="91cdfd54-2ee7-490e-bf3f-563406e59cda" containerName="openstack-network-exporter" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.089012 
4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a279c7-00a8-4e98-8356-91e219416a22" containerName="dnsmasq-dns" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.089212 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84rn8" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerDied","Data":"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c5b6658f-tdgsh" event={"ID":"3e6d90d2-e7e3-4245-b3a6-042621e01a67","Type":"ContainerDied","Data":"78085f7a145fe8f236757523ada7ae443e7b3ab85638d5063fc54c7855365882"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090580 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84rn8" event={"ID":"2de58390-335b-40cc-8461-d931d3b22e41","Type":"ContainerDied","Data":"90ffa1dacc5321713c5d44a9d616add617a25ab1efffcadfb14af28f07cc7bbd"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090602 4816 scope.go:117] "RemoveContainer" containerID="526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.090692 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.093022 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.109964 4816 generic.go:334] "Generic (PLEG): container finished" podID="fd796be0-d1ac-47be-8162-3b1c42febc0a" containerID="de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e" exitCode=0 Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.111074 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.114104 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fd796be0-d1ac-47be-8162-3b1c42febc0a","Type":"ContainerDied","Data":"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.114220 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fd796be0-d1ac-47be-8162-3b1c42febc0a","Type":"ContainerDied","Data":"73799c30d5d3ab5fe26ad3cf5939299dea4d34493e455f7bcdac484f34941957"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.122913 4816 scope.go:117] "RemoveContainer" containerID="ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.123734 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-2lrkx" event={"ID":"c47c9b57-0735-415f-a1a1-4b3096e3fbcf","Type":"ContainerStarted","Data":"907e6d1395bfe6aa206c07c8f0ecff9b2205c70baa8db02eecf5138662995725"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.127359 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:23 
crc kubenswrapper[4816]: I0311 12:22:23.137324 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.138298 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.138356 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.138499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.138552 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.139591 4816 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.139634 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.139644 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.140695 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-9hnkv" event={"ID":"5e637fcd-e45c-479c-856d-086d642af3bb","Type":"ContainerStarted","Data":"4f023b2d8c3517ea66e0705887fb61a09310d77cc0b2edae0368635152923da3"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.166504 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68" (OuterVolumeSpecName: "kube-api-access-jkd68") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "kube-api-access-jkd68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.173971 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84rn8"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191259 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" exitCode=0 Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191304 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" exitCode=0 Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191318 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" exitCode=0 Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191428 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191476 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.191504 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.194820 4816 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data" (OuterVolumeSpecName: "config-data") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.211414 4816 scope.go:117] "RemoveContainer" containerID="526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.213065 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000\": container with ID starting with 526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000 not found: ID does not exist" containerID="526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.213106 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000"} err="failed to get container status \"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000\": rpc error: code = NotFound desc = could not find container \"526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000\": container with ID starting with 526e39d56a3ef06aabde599a52928183d785fb0defd865027d97973b83934000 not found: ID does not exist" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.213125 4816 scope.go:117] "RemoveContainer" containerID="ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.220355 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff\": container with 
ID starting with ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff not found: ID does not exist" containerID="ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.220387 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff"} err="failed to get container status \"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff\": rpc error: code = NotFound desc = could not find container \"ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff\": container with ID starting with ea5c353eabccdde33e08d88c70444e4944a8f2019a7db074ae615e6ef96ee3ff not found: ID does not exist" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.220411 4816 scope.go:117] "RemoveContainer" containerID="b67798b7f6eede8770ea6cbb3808f928e4bdbe9cdbf08abe0db324318159dd17" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.224558 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.224665 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-7h7r8" event={"ID":"32a279c7-00a8-4e98-8356-91e219416a22","Type":"ContainerDied","Data":"88f0e5edf59a2c15eb9814f01d499e770f690a88f8bf62d0decdbb14e939c9e6"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.241474 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.246813 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.247190 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data" (OuterVolumeSpecName: "config-data") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.250023 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") pod \"fd796be0-d1ac-47be-8162-3b1c42febc0a\" (UID: \"fd796be0-d1ac-47be-8162-3b1c42febc0a\") " Mar 11 12:22:23 crc kubenswrapper[4816]: W0311 12:22:23.250176 4816 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fd796be0-d1ac-47be-8162-3b1c42febc0a/volumes/kubernetes.io~secret/config-data Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.250201 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data" (OuterVolumeSpecName: "config-data") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.250556 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") pod \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\" (UID: \"3e6d90d2-e7e3-4245-b3a6-042621e01a67\") " Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.252134 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klbjm\" (UniqueName: \"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: W0311 12:22:23.252287 4816 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3e6d90d2-e7e3-4245-b3a6-042621e01a67/volumes/kubernetes.io~secret/internal-tls-certs Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.252307 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e6d90d2-e7e3-4245-b3a6-042621e01a67" (UID: "3e6d90d2-e7e3-4245-b3a6-042621e01a67"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.269430 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.269933 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.270794 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkd68\" (UniqueName: \"kubernetes.io/projected/fd796be0-d1ac-47be-8162-3b1c42febc0a-kube-api-access-jkd68\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.270894 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.270972 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6d90d2-e7e3-4245-b3a6-042621e01a67-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.271047 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.274951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.290587 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-84rn8"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.301169 4816 scope.go:117] "RemoveContainer" containerID="de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.301500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.304511 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "fd796be0-d1ac-47be-8162-3b1c42febc0a" (UID: "fd796be0-d1ac-47be-8162-3b1c42febc0a"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.331787 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16e7d30-3235-44f2-81b4-c0c828071bbb/ovsdbserver-nb/0.log" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.331872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16e7d30-3235-44f2-81b4-c0c828071bbb","Type":"ContainerDied","Data":"33dcb516fa17b7c432ef1e2b1650ba4d2e9f946dd76257f934af302a386a7dbf"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.331994 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.345856 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe419fb1-1901-4fd4-9d9c-8884651e3ad9/ovsdbserver-sb/0.log" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.345942 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe419fb1-1901-4fd4-9d9c-8884651e3ad9","Type":"ContainerDied","Data":"856ecaff8a78617160b7f62ce0d1169e3c52ef425eb093d777cccb4f585957a7"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.346056 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.361387 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e16e7d30-3235-44f2-81b4-c0c828071bbb" (UID: "e16e7d30-3235-44f2-81b4-c0c828071bbb"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.365374 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.370115 4816 scope.go:117] "RemoveContainer" containerID="de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.370374 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-7h7r8"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.370562 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e\": container with ID starting with de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e not found: ID does not exist" containerID="de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.370604 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e"} err="failed to get container status \"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e\": rpc error: code = NotFound desc = could not find container \"de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e\": container with ID starting with de21378c0051d3ac4940fe242c0e851f880805f3d01edc4d6ef2444f52ded95e not found: ID does not exist" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.370637 4816 scope.go:117] "RemoveContainer" containerID="373cac1249bba137b237fe973a3b7880bfcca6318c8db162f6ca4526fa918835" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373215 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klbjm\" (UniqueName: 
\"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373285 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373404 4816 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373418 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16e7d30-3235-44f2-81b4-c0c828071bbb-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373429 4816 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.373439 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd796be0-d1ac-47be-8162-3b1c42febc0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.374508 4816 generic.go:334] "Generic (PLEG): container finished" podID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" 
exitCode=0 Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.374608 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerDied","Data":"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c"} Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.374852 4816 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.374881 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.374901 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts podName:3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb nodeName:}" failed. No retries permitted until 2026-03-11 12:22:25.374887158 +0000 UTC m=+1431.966151125 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts") pod "root-account-create-update-snf5b" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb") : configmap "openstack-cell1-scripts" not found Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.400362 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klbjm\" (UniqueName: \"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") pod \"root-account-create-update-bvvkj\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.401943 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.412454 4816 generic.go:334] "Generic (PLEG): container finished" podID="594ad696-b727-4153-979f-d32ccdc1fe83" containerID="4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" exitCode=0 Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.412843 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerDied","Data":"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8"} Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.420267 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.434837 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.442314 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.456602 4816 scope.go:117] "RemoveContainer" containerID="1876def9a0f72b0ad981ff600f29fd745c0daa03affcc6a0a2083718b834badc" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.456787 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.467473 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6c5b6658f-tdgsh"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.673656 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.675344 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.681579 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 11 12:22:23 crc kubenswrapper[4816]: E0311 12:22:23.681666 4816 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.838601 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.871963 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.874778 4816 scope.go:117] "RemoveContainer" containerID="5e227ce28f5de77017097c97e0a28037dfd14090da88c0fa20d1f53e10f8268b" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.908753 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.951409 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.961001 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:23 crc kubenswrapper[4816]: I0311 12:22:23.965231 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.004630 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") pod \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.004915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") pod \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\" (UID: \"c47c9b57-0735-415f-a1a1-4b3096e3fbcf\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.006174 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c47c9b57-0735-415f-a1a1-4b3096e3fbcf" (UID: "c47c9b57-0735-415f-a1a1-4b3096e3fbcf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.008471 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.019559 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc" (OuterVolumeSpecName: "kube-api-access-8msfc") pod "c47c9b57-0735-415f-a1a1-4b3096e3fbcf" (UID: "c47c9b57-0735-415f-a1a1-4b3096e3fbcf"). InnerVolumeSpecName "kube-api-access-8msfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.058672 4816 scope.go:117] "RemoveContainer" containerID="e36d52352569b57940dd2cebcd565fb31e6c049d444d2da7c54f0fe9d882c7f6" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.110384 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") pod \"5e637fcd-e45c-479c-856d-086d642af3bb\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.110754 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") pod \"5e637fcd-e45c-479c-856d-086d642af3bb\" (UID: \"5e637fcd-e45c-479c-856d-086d642af3bb\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.111145 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"5e637fcd-e45c-479c-856d-086d642af3bb" (UID: "5e637fcd-e45c-479c-856d-086d642af3bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.113185 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8msfc\" (UniqueName: \"kubernetes.io/projected/c47c9b57-0735-415f-a1a1-4b3096e3fbcf-kube-api-access-8msfc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.113713 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e637fcd-e45c-479c-856d-086d642af3bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.125504 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x" (OuterVolumeSpecName: "kube-api-access-hn92x") pod "5e637fcd-e45c-479c-856d-086d642af3bb" (UID: "5e637fcd-e45c-479c-856d-086d642af3bb"). InnerVolumeSpecName "kube-api-access-hn92x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.144121 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de58390-335b-40cc-8461-d931d3b22e41" path="/var/lib/kubelet/pods/2de58390-335b-40cc-8461-d931d3b22e41/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.144963 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a279c7-00a8-4e98-8356-91e219416a22" path="/var/lib/kubelet/pods/32a279c7-00a8-4e98-8356-91e219416a22/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.145796 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6d90d2-e7e3-4245-b3a6-042621e01a67" path="/var/lib/kubelet/pods/3e6d90d2-e7e3-4245-b3a6-042621e01a67/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.146685 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502b3843-8246-4715-9735-dfc0336caacb" path="/var/lib/kubelet/pods/502b3843-8246-4715-9735-dfc0336caacb/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.147368 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16e7d30-3235-44f2-81b4-c0c828071bbb" path="/var/lib/kubelet/pods/e16e7d30-3235-44f2-81b4-c0c828071bbb/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.148542 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd796be0-d1ac-47be-8162-3b1c42febc0a" path="/var/lib/kubelet/pods/fd796be0-d1ac-47be-8162-3b1c42febc0a/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.149802 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe419fb1-1901-4fd4-9d9c-8884651e3ad9" path="/var/lib/kubelet/pods/fe419fb1-1901-4fd4-9d9c-8884651e3ad9/volumes" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.191890 4816 scope.go:117] "RemoveContainer" containerID="4c01622c11d3f3812a2eae31ec2decc063cf1fe9d275e29cfb942cdc480ba8db" Mar 11 
12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.210520 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.217860 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.217913 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.217934 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.217963 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.218017 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") pod \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\" (UID: \"ddd535a1-7585-4cb7-94ec-f4b98b10be4a\") " Mar 11 12:22:24 crc 
kubenswrapper[4816]: I0311 12:22:24.218363 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn92x\" (UniqueName: \"kubernetes.io/projected/5e637fcd-e45c-479c-856d-086d642af3bb-kube-api-access-hn92x\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.227731 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.172:8776/healthcheck\": read tcp 10.217.0.2:41490->10.217.0.172:8776: read: connection reset by peer" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.241561 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.241966 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-central-agent" containerID="cri-o://84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.242212 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="proxy-httpd" containerID="cri-o://a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.242304 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="sg-core" containerID="cri-o://824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.242348 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-notification-agent" containerID="cri-o://a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.264017 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs" (OuterVolumeSpecName: "logs") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.316533 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.317002 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics" containerID="cri-o://87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.317523 4816 scope.go:117] "RemoveContainer" containerID="ee8f2b910a2d52b32d76649fbccb57d3440b0a1d624504112ddbe71af6ca7889" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.318482 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9" (OuterVolumeSpecName: "kube-api-access-tl4t9") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "kube-api-access-tl4t9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.319772 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl4t9\" (UniqueName: \"kubernetes.io/projected/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-kube-api-access-tl4t9\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.319791 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.322567 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.348180 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.350125 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 
12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.350810 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.350857 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.397666 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.410600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data" (OuterVolumeSpecName: "config-data") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.436908 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.436933 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.454848 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd535a1-7585-4cb7-94ec-f4b98b10be4a" (UID: "ddd535a1-7585-4cb7-94ec-f4b98b10be4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.487130 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.497802 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.550546 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd535a1-7585-4cb7-94ec-f4b98b10be4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.563004 4816 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/memcached-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.563456 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="5030028c-f574-4334-a837-2430761524b4" containerName="memcached" containerID="cri-o://0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.600409 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.600896 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.624643 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9b21-account-create-update-r8vgg"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.635880 4816 generic.go:334] "Generic (PLEG): container finished" podID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerID="c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06" exitCode=0 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.635989 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerDied","Data":"c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.639872 4816 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.641019 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener-log" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.641039 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener-log" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.641055 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.641062 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.641287 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener-log" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.641302 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerName="barbican-keystone-listener" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.642118 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.655963 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.656033 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:28.656011939 +0000 UTC m=+1435.247275906 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.665589 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.666533 4816 generic.go:334] "Generic (PLEG): container finished" podID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" containerID="a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" exitCode=0 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.666646 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerDied","Data":"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.666686 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" 
event={"ID":"ddd535a1-7585-4cb7-94ec-f4b98b10be4a","Type":"ContainerDied","Data":"84c045541bc73afd53de86393645863c006080b89347feb36d269d40b0b6ac28"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.666763 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59b4f4d478-5b797" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.667842 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.690783 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.706427 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kbmsk"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.722592 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-w8rqc"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.741483 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4a8d-account-create-update-2lrkx" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.741493 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a8d-account-create-update-2lrkx" event={"ID":"c47c9b57-0735-415f-a1a1-4b3096e3fbcf","Type":"ContainerDied","Data":"907e6d1395bfe6aa206c07c8f0ecff9b2205c70baa8db02eecf5138662995725"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.759374 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.759469 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.761510 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kbmsk"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.782575 4816 generic.go:334] "Generic (PLEG): container finished" podID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerID="6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b" exitCode=0 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.782765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" 
event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerDied","Data":"6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.793120 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.808661 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5d6ddcd789-qjf9c" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" containerName="keystone-api" containerID="cri-o://40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" gracePeriod=30 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.819759 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.841041 4816 generic.go:334] "Generic (PLEG): container finished" podID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerID="08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054" exitCode=0 Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.841156 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerDied","Data":"08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.844326 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-963f-account-create-update-9hnkv" event={"ID":"5e637fcd-e45c-479c-856d-086d642af3bb","Type":"ContainerDied","Data":"4f023b2d8c3517ea66e0705887fb61a09310d77cc0b2edae0368635152923da3"} Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.844372 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-963f-account-create-update-9hnkv" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.861751 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.866602 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.866759 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.867536 4816 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.867602 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:25.367577625 +0000 UTC m=+1431.958841592 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : configmap "openstack-scripts" not found Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.913824 4816 projected.go:194] Error preparing data for projected volume kube-api-access-24z58 for pod openstack/keystone-9b21-account-create-update-cmcfl: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:24 crc kubenswrapper[4816]: E0311 12:22:24.913938 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58 podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:25.413912605 +0000 UTC m=+1432.005176572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-24z58" (UniqueName: "kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.928155 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.951261 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rmcqp"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.983384 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:24 crc kubenswrapper[4816]: I0311 12:22:24.993575 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 
12:22:25.049547 4816 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 12:22:25 crc kubenswrapper[4816]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: if [ -n "" ]; then Mar 11 12:22:25 crc kubenswrapper[4816]: GRANT_DATABASE="" Mar 11 12:22:25 crc kubenswrapper[4816]: else Mar 11 12:22:25 crc kubenswrapper[4816]: GRANT_DATABASE="*" Mar 11 12:22:25 crc kubenswrapper[4816]: fi Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: # going for maximum compatibility here: Mar 11 12:22:25 crc kubenswrapper[4816]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 11 12:22:25 crc kubenswrapper[4816]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 11 12:22:25 crc kubenswrapper[4816]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 11 12:22:25 crc kubenswrapper[4816]: # support updates Mar 11 12:22:25 crc kubenswrapper[4816]: Mar 11 12:22:25 crc kubenswrapper[4816]: $MYSQL_CMD < logger="UnhandledError" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.053185 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-bvvkj" podUID="2d60557e-d939-46bf-8a60-641016b4d68d" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.133539 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": dial tcp 10.217.0.210:8775: connect: connection refused" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.133550 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": dial tcp 10.217.0.210:8775: connect: connection refused" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.224453 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera" containerID="cri-o://c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" gracePeriod=30 Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.382760 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") pod 
\"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.382996 4816 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.383079 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts podName:3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb nodeName:}" failed. No retries permitted until 2026-03-11 12:22:29.383057307 +0000 UTC m=+1435.974321274 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts") pod "root-account-create-update-snf5b" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb") : configmap "openstack-cell1-scripts" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.383486 4816 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.383516 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:26.38350569 +0000 UTC m=+1432.974769657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : configmap "openstack-scripts" not found Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.487920 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.492576 4816 projected.go:194] Error preparing data for projected volume kube-api-access-24z58 for pod openstack/keystone-9b21-account-create-update-cmcfl: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.492881 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58 podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:26.49285803 +0000 UTC m=+1433.084121997 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-24z58" (UniqueName: "kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.611838 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-24z58 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-9b21-account-create-update-cmcfl" podUID="1da70aee-e1eb-4ad5-b0de-1e2f988dd729" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.627935 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.637652 4816 scope.go:117] "RemoveContainer" containerID="fd6533a10f6d22b4d1d7a2a73ad8cc4591438b77aefeced48dbf3b4526cf28f0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.643874 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-snf5b" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.671539 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-59b4f4d478-5b797"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.682585 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-nv5hk" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.685677 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.691664 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") pod \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.691882 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") pod \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\" (UID: \"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.695868 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.696446 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.706918 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.717831 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw" (OuterVolumeSpecName: "kube-api-access-vf9zw") pod "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" (UID: "3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb"). 
InnerVolumeSpecName "kube-api-access-vf9zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.721349 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.722085 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ffd6fb588-7hftz" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.727997 4816 scope.go:117] "RemoveContainer" containerID="a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.734691 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-963f-account-create-update-9hnkv"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.764107 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.780195 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4a8d-account-create-update-2lrkx"] Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.793973 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794008 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794062 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794106 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794142 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794209 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794240 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794281 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") pod \"32dcc96b-186a-444d-bef3-4c5f117ee652\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " 
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794312 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794333 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") pod \"32dcc96b-186a-444d-bef3-4c5f117ee652\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794365 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794382 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794423 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794472 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794489 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794527 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") pod \"32dcc96b-186a-444d-bef3-4c5f117ee652\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794548 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") " Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") pod \"32dcc96b-186a-444d-bef3-4c5f117ee652\" (UID: \"32dcc96b-186a-444d-bef3-4c5f117ee652\") " Mar 
11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794601 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794624 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794652 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794680 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e95ddca0-76d0-4dce-9983-4b07655adc25\" (UID: \"e95ddca0-76d0-4dce-9983-4b07655adc25\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794703 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794725 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") pod \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794761 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794778 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") pod \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\" (UID: \"b1dd25da-51d6-45f0-b70c-f1baa17d2da3\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794803 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") pod \"9a22173f-147b-46ac-bb01-596fe9f12b10\" (UID: \"9a22173f-147b-46ac-bb01-596fe9f12b10\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.794841 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") pod \"7bd939d8-3b22-4496-acea-ac527f3e5149\" (UID: \"7bd939d8-3b22-4496-acea-ac527f3e5149\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.796218 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.796559 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.796960 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.796982 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf9zw\" (UniqueName: \"kubernetes.io/projected/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb-kube-api-access-vf9zw\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.797001 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.797011 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-generated\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.797095 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1dd25da-51d6-45f0-b70c-f1baa17d2da3" (UID: "b1dd25da-51d6-45f0-b70c-f1baa17d2da3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.797535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs" (OuterVolumeSpecName: "logs") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.798193 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs" (OuterVolumeSpecName: "logs") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.805270 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts" (OuterVolumeSpecName: "scripts") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.805289 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts" (OuterVolumeSpecName: "scripts") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.805327 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.807897 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.808176 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.808521 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85" (OuterVolumeSpecName: "kube-api-access-c8b85") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "kube-api-access-c8b85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.810014 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.811681 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x" (OuterVolumeSpecName: "kube-api-access-grm8x") pod "b1dd25da-51d6-45f0-b70c-f1baa17d2da3" (UID: "b1dd25da-51d6-45f0-b70c-f1baa17d2da3"). InnerVolumeSpecName "kube-api-access-grm8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.813043 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.824099 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused"
Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.824493 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.834951 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4" (OuterVolumeSpecName: "kube-api-access-6dlr4") pod "32dcc96b-186a-444d-bef3-4c5f117ee652" (UID: "32dcc96b-186a-444d-bef3-4c5f117ee652"). InnerVolumeSpecName "kube-api-access-6dlr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.837891 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb" (OuterVolumeSpecName: "kube-api-access-cnzpb") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "kube-api-access-cnzpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.846105 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4" (OuterVolumeSpecName: "kube-api-access-xbrs4") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "kube-api-access-xbrs4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.850426 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32dcc96b-186a-444d-bef3-4c5f117ee652" (UID: "32dcc96b-186a-444d-bef3-4c5f117ee652"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.854452 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.895860 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898598 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898788 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898833 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898878 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.898937 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") pod \"7d73d9d0-5632-47a3-93e0-899f64f51011\" (UID: \"7d73d9d0-5632-47a3-93e0-899f64f51011\") "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899485 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899505 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899517 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grm8x\" (UniqueName: \"kubernetes.io/projected/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-kube-api-access-grm8x\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899535 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899545 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1dd25da-51d6-45f0-b70c-f1baa17d2da3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899554 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899564 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899572 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ddca0-76d0-4dce-9983-4b07655adc25-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899582 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-config-data-default\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899593 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8b85\" (UniqueName: \"kubernetes.io/projected/e95ddca0-76d0-4dce-9983-4b07655adc25-kube-api-access-c8b85\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899603 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dlr4\" (UniqueName: \"kubernetes.io/projected/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-api-access-6dlr4\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899612 4816 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a22173f-147b-46ac-bb01-596fe9f12b10-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899622 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bd939d8-3b22-4496-acea-ac527f3e5149-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899632 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/9a22173f-147b-46ac-bb01-596fe9f12b10-kube-api-access-cnzpb\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899641 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbrs4\" (UniqueName: \"kubernetes.io/projected/7bd939d8-3b22-4496-acea-ac527f3e5149-kube-api-access-xbrs4\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.899650 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.906978 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 12:22:25 crc kubenswrapper[4816]: E0311 12:22:25.907072 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor"
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.913569 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs" (OuterVolumeSpecName: "logs") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.958736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f" (OuterVolumeSpecName: "kube-api-access-nb22f") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "kube-api-access-nb22f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.963200 4816 generic.go:334] "Generic (PLEG): container finished" podID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerID="8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749" exitCode=0
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.963460 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerDied","Data":"8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.965558 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4bcf-account-create-update-nv5hk" event={"ID":"b1dd25da-51d6-45f0-b70c-f1baa17d2da3","Type":"ContainerDied","Data":"1d267594122bbe4cf05c9b26645399ea847bc2099ad10ee8bb693c8e2675f8e5"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.965633 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4bcf-account-create-update-nv5hk"
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.982854 4816 generic.go:334] "Generic (PLEG): container finished" podID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerID="9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" exitCode=0
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.982969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerDied","Data":"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.983003 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e95ddca0-76d0-4dce-9983-4b07655adc25","Type":"ContainerDied","Data":"f13badcbc5010cfb4035a99958d3aebf412aeabedcc2f776bca112d761fa63de"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.983098 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.984433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-snf5b" event={"ID":"3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb","Type":"ContainerDied","Data":"37dc46cbca9b814e026266eb10b0888ee0b98d2b5a77de8a934c3e1d5742969a"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.984510 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-snf5b"
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.987651 4816 generic.go:334] "Generic (PLEG): container finished" podID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerID="05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2" exitCode=0
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.987851 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.988440 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerDied","Data":"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.988472 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d73d9d0-5632-47a3-93e0-899f64f51011","Type":"ContainerDied","Data":"c81825bf2b4be781ea36bdb64201016c8530a7353fa6d58d50264ccf72608bde"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.992517 4816 generic.go:334] "Generic (PLEG): container finished" podID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerID="87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c" exitCode=2
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.992611 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32dcc96b-186a-444d-bef3-4c5f117ee652","Type":"ContainerDied","Data":"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.992652 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32dcc96b-186a-444d-bef3-4c5f117ee652","Type":"ContainerDied","Data":"1e343e65b4d8cc4645e88fc1c1a55d93ec648ea21d55e4018feab7481fc909e7"}
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.992743 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 11 12:22:25 crc kubenswrapper[4816]: I0311 12:22:25.995605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bvvkj" event={"ID":"2d60557e-d939-46bf-8a60-641016b4d68d","Type":"ContainerStarted","Data":"8ea9826afd6446a559af78c72ed8d7f368b8a030b60ae6f7af907a7806773c5c"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.004581 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb22f\" (UniqueName: \"kubernetes.io/projected/7d73d9d0-5632-47a3-93e0-899f64f51011-kube-api-access-nb22f\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.004677 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d73d9d0-5632-47a3-93e0-899f64f51011-logs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025083 4816 generic.go:334] "Generic (PLEG): container finished" podID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerID="a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809" exitCode=0
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025138 4816 generic.go:334] "Generic (PLEG): container finished" podID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerID="824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5" exitCode=2
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025147 4816 generic.go:334] "Generic (PLEG): container finished" podID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerID="a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242" exitCode=0
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025156 4816 generic.go:334] "Generic (PLEG): container finished" podID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerID="84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015" exitCode=0
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025224 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025277 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025291 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.025301 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.027851 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ffd6fb588-7hftz" event={"ID":"7bd939d8-3b22-4496-acea-ac527f3e5149","Type":"ContainerDied","Data":"584cd4107522305bdba692719070a92eec3324ee2da427663b64c0c877cbea0c"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.027965 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ffd6fb588-7hftz"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.031849 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9a22173f-147b-46ac-bb01-596fe9f12b10","Type":"ContainerDied","Data":"ef8afb38cbe161f1b81f860d56715a732c9c137776bc40df909c84b5acbd4154"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.031959 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.035272 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c94c19c-3ccb-43cc-ab41-92baa3141f73","Type":"ContainerDied","Data":"601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.035305 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601d8bfb1ac6479d4e58832dfee18035d25eae3e88360d11ef1513118c0bd2f3"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.039519 4816 generic.go:334] "Generic (PLEG): container finished" podID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerID="f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d" exitCode=0
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.039605 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerDied","Data":"f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.039641 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d28745d2-082d-4c99-90f0-b6c4696fb1a2","Type":"ContainerDied","Data":"8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.039655 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a5fef237ae36daf657628ae1e951a8f33300f04ba146b0b7c82c1251a514014"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.051270 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.052438 4816 generic.go:334] "Generic (PLEG): container finished" podID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerID="4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb" exitCode=0
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.053410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerDied","Data":"4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.057746 4816 generic.go:334] "Generic (PLEG): container finished" podID="7795071e-2de0-43cb-b225-cfed54570d94" containerID="5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2" exitCode=0
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.057828 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b21-account-create-update-cmcfl"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.058530 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerDied","Data":"5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.058568 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b59f8d4-2vxd9" event={"ID":"7795071e-2de0-43cb-b225-cfed54570d94","Type":"ContainerDied","Data":"9de7e47c0f14568909f59552b05e938af6254c4c9840ec07004683a8c3fa16e2"}
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.058585 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9de7e47c0f14568909f59552b05e938af6254c4c9840ec07004683a8c3fa16e2"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.068492 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "32dcc96b-186a-444d-bef3-4c5f117ee652" (UID: "32dcc96b-186a-444d-bef3-4c5f117ee652"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.069093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.095082 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.100548 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "32dcc96b-186a-444d-bef3-4c5f117ee652" (UID: "32dcc96b-186a-444d-bef3-4c5f117ee652"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114336 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114363 4816 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114374 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114384 4816 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcc96b-186a-444d-bef3-4c5f117ee652-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.114395 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.137500 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data" (OuterVolumeSpecName: "config-data") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.151972 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.169655 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data" (OuterVolumeSpecName: "config-data") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.170685 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.178722 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7d73d9d0-5632-47a3-93e0-899f64f51011" (UID: "7d73d9d0-5632-47a3-93e0-899f64f51011"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.179697 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ce1ef6-fcd0-4182-afca-22c5892b48e2" path="/var/lib/kubelet/pods/09ce1ef6-fcd0-4182-afca-22c5892b48e2/volumes"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.180611 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56" path="/var/lib/kubelet/pods/45e6e9e0-bfd4-4e8d-823b-9e2bfdfe6d56/volumes"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.181387 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e637fcd-e45c-479c-856d-086d642af3bb" path="/var/lib/kubelet/pods/5e637fcd-e45c-479c-856d-086d642af3bb/volumes"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.183548 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625b367b-084e-4cf8-8c30-5d4df9c696f9" path="/var/lib/kubelet/pods/625b367b-084e-4cf8-8c30-5d4df9c696f9/volumes"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.184454 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742cfc03-0365-4df8-a7f6-e6eac11ba045" path="/var/lib/kubelet/pods/742cfc03-0365-4df8-a7f6-e6eac11ba045/volumes"
Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.185135 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47c9b57-0735-415f-a1a1-4b3096e3fbcf"
path="/var/lib/kubelet/pods/c47c9b57-0735-415f-a1a1-4b3096e3fbcf/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.185816 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd535a1-7585-4cb7-94ec-f4b98b10be4a" path="/var/lib/kubelet/pods/ddd535a1-7585-4cb7-94ec-f4b98b10be4a/volumes" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.194727 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data" (OuterVolumeSpecName: "config-data") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.207797 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9a22173f-147b-46ac-bb01-596fe9f12b10" (UID: "9a22173f-147b-46ac-bb01-596fe9f12b10"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218287 4816 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218328 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218337 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218350 4816 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a22173f-147b-46ac-bb01-596fe9f12b10-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218360 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218370 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.218378 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d73d9d0-5632-47a3-93e0-899f64f51011-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.234419 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e95ddca0-76d0-4dce-9983-4b07655adc25" (UID: "e95ddca0-76d0-4dce-9983-4b07655adc25"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.310205 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.308158 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.327330 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ddca0-76d0-4dce-9983-4b07655adc25-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.329066 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.331700 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.363069 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.363346 4816 scope.go:117] "RemoveContainer" containerID="5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.379020 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.400561 4816 scope.go:117] "RemoveContainer" containerID="a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.416734 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207\": container with ID starting with a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207 not found: ID does not exist" containerID="a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.416819 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207"} err="failed to get container status \"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207\": rpc error: code = NotFound desc = could not find container \"a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207\": container with ID 
starting with a2fe652a36263402ff94fa1d4ec821be087bc6255f2da08fbe025571394de207 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.416851 4816 scope.go:117] "RemoveContainer" containerID="5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.417943 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85\": container with ID starting with 5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85 not found: ID does not exist" containerID="5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.418034 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85"} err="failed to get container status \"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85\": rpc error: code = NotFound desc = could not find container \"5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85\": container with ID starting with 5cfae0145ad988b78f57674ae7aa14b5835657d9dac7b0c977c144c0d4304d85 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.418067 4816 scope.go:117] "RemoveContainer" containerID="9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.422710 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.432775 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.432877 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.432931 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.432961 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.433105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.433199 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.433232 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.434347 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.435332 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs" (OuterVolumeSpecName: "logs") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.433328 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.444003 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454586 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") pod 
\"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454685 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454713 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454751 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454775 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454841 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") pod \"7795071e-2de0-43cb-b225-cfed54570d94\" (UID: \"7795071e-2de0-43cb-b225-cfed54570d94\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454872 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.454929 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") pod \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\" (UID: \"1c94c19c-3ccb-43cc-ab41-92baa3141f73\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.455664 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.455771 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs" (OuterVolumeSpecName: "logs") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.455844 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.455989 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c94c19c-3ccb-43cc-ab41-92baa3141f73-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.456055 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c94c19c-3ccb-43cc-ab41-92baa3141f73-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.455947 4816 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.456238 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:28.456213725 +0000 UTC m=+1435.047477692 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : configmap "openstack-scripts" not found Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.456668 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bd939d8-3b22-4496-acea-ac527f3e5149" (UID: "7bd939d8-3b22-4496-acea-ac527f3e5149"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.459197 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.460613 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.471035 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.474018 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.474575 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.478554 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.485319 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.488092 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.490194 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.490424 4816 scope.go:117] "RemoveContainer" containerID="c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.491583 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.491687 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.499560 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts" (OuterVolumeSpecName: "scripts") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.500138 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs" (OuterVolumeSpecName: "kube-api-access-g5svs") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "kube-api-access-g5svs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.501670 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4bcf-account-create-update-nv5hk"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.514000 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.519274 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j" (OuterVolumeSpecName: "kube-api-access-mgm8j") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "kube-api-access-mgm8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.526070 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.545596 4816 scope.go:117] "RemoveContainer" containerID="9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.558979 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764\": container with ID starting with 9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764 not found: ID does not exist" containerID="9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559063 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764"} err="failed to get container status \"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764\": rpc error: code = 
NotFound desc = could not find container \"9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764\": container with ID starting with 9dfd5d9de37a643541d7d99bf2ad8ffbb190d4d99b4400e1d3e559828813b764 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559113 4816 scope.go:117] "RemoveContainer" containerID="c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559774 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559847 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.559950 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560012 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560070 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr9hs\" (UniqueName: 
\"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560103 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560142 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560177 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560266 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560297 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqhs8\" (UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 
12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560323 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560362 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560390 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560420 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560474 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560521 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klbjm\" (UniqueName: 
\"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") pod \"2d60557e-d939-46bf-8a60-641016b4d68d\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560562 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") pod \"2d60557e-d939-46bf-8a60-641016b4d68d\" (UID: \"2d60557e-d939-46bf-8a60-641016b4d68d\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560657 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560709 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560743 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560780 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560812 
4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") pod \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\" (UID: \"bedb612d-0e22-4025-9151-d0cf7bc4ee42\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560838 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560866 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.560925 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") pod \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\" (UID: \"d28745d2-082d-4c99-90f0-b6c4696fb1a2\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561001 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561044 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc 
kubenswrapper[4816]: I0311 12:22:26.561134 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") pod \"b79e89c6-5f56-4439-ad63-a86259d4ed29\" (UID: \"b79e89c6-5f56-4439-ad63-a86259d4ed29\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561188 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") pod \"7457f2db-7979-4d92-bd90-a1464b8a3878\" (UID: \"7457f2db-7979-4d92-bd90-a1464b8a3878\") " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") pod \"keystone-9b21-account-create-update-cmcfl\" (UID: \"1da70aee-e1eb-4ad5-b0de-1e2f988dd729\") " pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561810 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgm8j\" (UniqueName: \"kubernetes.io/projected/7795071e-2de0-43cb-b225-cfed54570d94-kube-api-access-mgm8j\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561840 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd939d8-3b22-4496-acea-ac527f3e5149-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561855 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc 
kubenswrapper[4816]: I0311 12:22:26.561868 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561880 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561934 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7795071e-2de0-43cb-b225-cfed54570d94-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.561950 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5svs\" (UniqueName: \"kubernetes.io/projected/1c94c19c-3ccb-43cc-ab41-92baa3141f73-kube-api-access-g5svs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.562171 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.562211 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.562330 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data podName:26aea2df-f497-478d-b953-060189ef2569 nodeName:}" failed. 
No retries permitted until 2026-03-11 12:22:34.56223497 +0000 UTC m=+1441.153498937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data") pod "rabbitmq-server-0" (UID: "26aea2df-f497-478d-b953-060189ef2569") : configmap "rabbitmq-config-data" not found Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.567328 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm" (OuterVolumeSpecName: "kube-api-access-klbjm") pod "2d60557e-d939-46bf-8a60-641016b4d68d" (UID: "2d60557e-d939-46bf-8a60-641016b4d68d"). InnerVolumeSpecName "kube-api-access-klbjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.560008 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25\": container with ID starting with c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25 not found: ID does not exist" containerID="c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.567401 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25"} err="failed to get container status \"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25\": rpc error: code = NotFound desc = could not find container \"c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25\": container with ID starting with c98a4983c1c555c8104fb916b00cb391571c199b1e9301413191c24c4a358d25 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.567437 4816 scope.go:117] "RemoveContainer" 
containerID="05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.569086 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d60557e-d939-46bf-8a60-641016b4d68d" (UID: "2d60557e-d939-46bf-8a60-641016b4d68d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.576111 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs" (OuterVolumeSpecName: "logs") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.585795 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.585890 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs" (OuterVolumeSpecName: "logs") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.586028 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.591235 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs" (OuterVolumeSpecName: "logs") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.592908 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts" (OuterVolumeSpecName: "scripts") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.595013 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-snf5b"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.595037 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.600956 4816 projected.go:194] Error preparing data for projected volume kube-api-access-24z58 for pod openstack/keystone-9b21-account-create-update-cmcfl: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.601048 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58 podName:1da70aee-e1eb-4ad5-b0de-1e2f988dd729 nodeName:}" failed. No retries permitted until 2026-03-11 12:22:28.601021324 +0000 UTC m=+1435.192285351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-24z58" (UniqueName: "kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58") pod "keystone-9b21-account-create-update-cmcfl" (UID: "1da70aee-e1eb-4ad5-b0de-1e2f988dd729") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.611691 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.618021 4816 scope.go:117] "RemoveContainer" containerID="494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.622448 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.630514 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts" (OuterVolumeSpecName: "scripts") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.630784 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8" (OuterVolumeSpecName: "kube-api-access-cqhs8") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "kube-api-access-cqhs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.636539 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz" (OuterVolumeSpecName: "kube-api-access-v4nrz") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "kube-api-access-v4nrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.637293 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.638631 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l" (OuterVolumeSpecName: "kube-api-access-g5r6l") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "kube-api-access-g5r6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.640620 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.643452 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.648470 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664322 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79e89c6-5f56-4439-ad63-a86259d4ed29-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664365 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664468 4816 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664577 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664593 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqhs8\" 
(UniqueName: \"kubernetes.io/projected/d28745d2-082d-4c99-90f0-b6c4696fb1a2-kube-api-access-cqhs8\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664607 4816 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bedb612d-0e22-4025-9151-d0cf7bc4ee42-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664620 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664632 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klbjm\" (UniqueName: \"kubernetes.io/projected/2d60557e-d939-46bf-8a60-641016b4d68d-kube-api-access-klbjm\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664643 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d60557e-d939-46bf-8a60-641016b4d68d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664655 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5r6l\" (UniqueName: \"kubernetes.io/projected/bedb612d-0e22-4025-9151-d0cf7bc4ee42-kube-api-access-g5r6l\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664666 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664677 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d28745d2-082d-4c99-90f0-b6c4696fb1a2-logs\") on node \"crc\" DevicePath \"\"" Mar 11 
12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664968 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.664990 4816 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7457f2db-7979-4d92-bd90-a1464b8a3878-logs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.665003 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4nrz\" (UniqueName: \"kubernetes.io/projected/b79e89c6-5f56-4439-ad63-a86259d4ed29-kube-api-access-v4nrz\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.669908 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs" (OuterVolumeSpecName: "kube-api-access-rr9hs") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "kube-api-access-rr9hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.692088 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.693043 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.693264 4816 scope.go:117] "RemoveContainer" containerID="05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.695924 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2\": container with ID starting with 05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2 not found: ID does not exist" containerID="05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.695963 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2"} err="failed to get container status \"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2\": rpc error: code = NotFound desc = could not find container \"05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2\": container with ID starting with 05538dd985ad20fb55582d69a35b969743ae902043cfc0d0fe6e1bf963056eb2 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.695987 4816 scope.go:117] "RemoveContainer" containerID="494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.699683 4816 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191\": container with ID starting with 494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191 not found: ID does not exist" containerID="494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.699927 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191"} err="failed to get container status \"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191\": rpc error: code = NotFound desc = could not find container \"494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191\": container with ID starting with 494b7c934e67413331c33cbc35a1ab84e1195c496bffebeeb4ea4a3917bff191 not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.699975 4816 scope.go:117] "RemoveContainer" containerID="87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.705487 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.718153 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.737612 4816 scope.go:117] "RemoveContainer" containerID="87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.738205 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c\": container with ID starting with 
87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c not found: ID does not exist" containerID="87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.738342 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c"} err="failed to get container status \"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c\": rpc error: code = NotFound desc = could not find container \"87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c\": container with ID starting with 87aa2b1e91cb5a822ed7cf28348c0737eb6cfc59a0a44ee9905ee11d4719f35c not found: ID does not exist" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.738392 4816 scope.go:117] "RemoveContainer" containerID="6309388e250c5434fd6b39ddce96cacd594c9880dd57d2c9e89074cac30a961b" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.767234 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr9hs\" (UniqueName: \"kubernetes.io/projected/7457f2db-7979-4d92-bd90-a1464b8a3878-kube-api-access-rr9hs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.767301 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.767313 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.779073 4816 scope.go:117] "RemoveContainer" containerID="3acd68e155620ecc4260fb5ba2dfe8af8d211b5066fc4c67c7f8658e47beb43f" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 
12:22:26.805558 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.816434 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.834406 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.868321 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.869240 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.871845 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.871865 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.871773 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.879841 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5ffd6fb588-7hftz"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.890096 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data" (OuterVolumeSpecName: "config-data") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.919726 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.924540 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.925646 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.927153 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 11 12:22:26 crc kubenswrapper[4816]: E0311 12:22:26.927246 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.929270 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.972349 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974441 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974471 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974484 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974494 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:26 crc kubenswrapper[4816]: I0311 12:22:26.974507 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.003868 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.005762 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.013093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data" (OuterVolumeSpecName: "config-data") pod "7795071e-2de0-43cb-b225-cfed54570d94" (UID: "7795071e-2de0-43cb-b225-cfed54570d94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.017272 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.023896 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.024321 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.032571 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data" (OuterVolumeSpecName: "config-data") pod "7457f2db-7979-4d92-bd90-a1464b8a3878" (UID: "7457f2db-7979-4d92-bd90-a1464b8a3878"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.036898 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.039162 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data" (OuterVolumeSpecName: "config-data") pod "1c94c19c-3ccb-43cc-ab41-92baa3141f73" (UID: "1c94c19c-3ccb-43cc-ab41-92baa3141f73"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.053776 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d28745d2-082d-4c99-90f0-b6c4696fb1a2" (UID: "d28745d2-082d-4c99-90f0-b6c4696fb1a2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076231 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076276 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076288 4816 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076297 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d28745d2-082d-4c99-90f0-b6c4696fb1a2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076306 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076316 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7457f2db-7979-4d92-bd90-a1464b8a3878-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076324 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076332 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7795071e-2de0-43cb-b225-cfed54570d94-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076341 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c94c19c-3ccb-43cc-ab41-92baa3141f73-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.076349 4816 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.084260 4816 generic.go:334] "Generic (PLEG): container finished" podID="63567eba-cc2a-4168-9e81-51c1daed5482" containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" exitCode=0 Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.084508 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"63567eba-cc2a-4168-9e81-51c1daed5482","Type":"ContainerDied","Data":"adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.084786 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"63567eba-cc2a-4168-9e81-51c1daed5482","Type":"ContainerDied","Data":"188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.084865 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.085448 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data" (OuterVolumeSpecName: "config-data") pod "b79e89c6-5f56-4439-ad63-a86259d4ed29" (UID: "b79e89c6-5f56-4439-ad63-a86259d4ed29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.095620 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.095645 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bedb612d-0e22-4025-9151-d0cf7bc4ee42","Type":"ContainerDied","Data":"c1f12afb3ed2335d5b28ac089b50b4a7d4f0e38f3d3c1e7e1f537108eabd58b9"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.098289 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data" (OuterVolumeSpecName: "config-data") pod "bedb612d-0e22-4025-9151-d0cf7bc4ee42" (UID: "bedb612d-0e22-4025-9151-d0cf7bc4ee42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.109846 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bvvkj" event={"ID":"2d60557e-d939-46bf-8a60-641016b4d68d","Type":"ContainerDied","Data":"8ea9826afd6446a559af78c72ed8d7f368b8a030b60ae6f7af907a7806773c5c"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.109906 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bvvkj" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.112496 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7457f2db-7979-4d92-bd90-a1464b8a3878","Type":"ContainerDied","Data":"722d37999c6fc7f3ffe4d8bb991503dcb67968fd32e0b13507be34c65c4fb635"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.112696 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.135445 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-855897fd55-t7sfb" event={"ID":"b79e89c6-5f56-4439-ad63-a86259d4ed29","Type":"ContainerDied","Data":"65e8dd7e6335c0228a44e94f23c28e5cede1dd965bd20e6b4cf61bc69bb5386a"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.135514 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-855897fd55-t7sfb" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.150934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5030028c-f574-4334-a837-2430761524b4","Type":"ContainerDied","Data":"0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.150958 4816 generic.go:334] "Generic (PLEG): container finished" podID="5030028c-f574-4334-a837-2430761524b4" containerID="0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637" exitCode=0 Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151139 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b21-account-create-update-cmcfl" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151129 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5030028c-f574-4334-a837-2430761524b4","Type":"ContainerDied","Data":"6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731"} Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151191 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151198 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b59f8d4-2vxd9" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151199 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.151925 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.179805 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedb612d-0e22-4025-9151-d0cf7bc4ee42-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.179830 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79e89c6-5f56-4439-ad63-a86259d4ed29-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.231797 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.260417 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.309565 4816 scope.go:117] "RemoveContainer" containerID="08358819a244a822957b7c7153f37ef3fa2c0371fe913be221e0cf6e09e89054" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.331009 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.337179 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.370534 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.371251 4816 scope.go:117] "RemoveContainer" containerID="90224f5e31cd4408489a5dec30ffa77147f611b179c23e40a3d0104504542a1b" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.380938 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bvvkj"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382148 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382179 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382250 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382409 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") pod \"63567eba-cc2a-4168-9e81-51c1daed5482\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382568 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382646 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcvt9\" (UniqueName: 
\"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") pod \"63567eba-cc2a-4168-9e81-51c1daed5482\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382682 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") pod \"5030028c-f574-4334-a837-2430761524b4\" (UID: \"5030028c-f574-4334-a837-2430761524b4\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.382727 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") pod \"63567eba-cc2a-4168-9e81-51c1daed5482\" (UID: \"63567eba-cc2a-4168-9e81-51c1daed5482\") " Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.383194 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.384314 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data" (OuterVolumeSpecName: "config-data") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.458283 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd" (OuterVolumeSpecName: "kube-api-access-d7gcd") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "kube-api-access-d7gcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.460639 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9" (OuterVolumeSpecName: "kube-api-access-jcvt9") pod "63567eba-cc2a-4168-9e81-51c1daed5482" (UID: "63567eba-cc2a-4168-9e81-51c1daed5482"). InnerVolumeSpecName "kube-api-access-jcvt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.473563 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data" (OuterVolumeSpecName: "config-data") pod "63567eba-cc2a-4168-9e81-51c1daed5482" (UID: "63567eba-cc2a-4168-9e81-51c1daed5482"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487484 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcvt9\" (UniqueName: \"kubernetes.io/projected/63567eba-cc2a-4168-9e81-51c1daed5482-kube-api-access-jcvt9\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487517 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487529 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487541 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7gcd\" (UniqueName: \"kubernetes.io/projected/5030028c-f574-4334-a837-2430761524b4-kube-api-access-d7gcd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.487552 4816 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5030028c-f574-4334-a837-2430761524b4-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.531519 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.553778 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63567eba-cc2a-4168-9e81-51c1daed5482" (UID: "63567eba-cc2a-4168-9e81-51c1daed5482"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.560185 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.589730 4816 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.589767 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63567eba-cc2a-4168-9e81-51c1daed5482-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.615487 4816 scope.go:117] "RemoveContainer" containerID="a80048ca909856187d3fa5dac7b542ba5ca3c8dbcb582537e0f884c753db4809" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.630395 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5030028c-f574-4334-a837-2430761524b4" (UID: "5030028c-f574-4334-a837-2430761524b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.679087 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9b21-account-create-update-cmcfl"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.691187 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5030028c-f574-4334-a837-2430761524b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.696972 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.706373 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.710607 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.719342 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-855897fd55-t7sfb"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.728388 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.733149 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-64b59f8d4-2vxd9"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.744326 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.751439 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.760877 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: E0311 
12:22:27.765057 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.774925 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.794727 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24z58\" (UniqueName: \"kubernetes.io/projected/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-kube-api-access-24z58\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.794765 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1da70aee-e1eb-4ad5-b0de-1e2f988dd729-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:27 crc kubenswrapper[4816]: E0311 12:22:27.802462 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:27 crc kubenswrapper[4816]: E0311 12:22:27.829058 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 12:22:27 crc kubenswrapper[4816]: E0311 12:22:27.829152 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.875382 4816 scope.go:117] "RemoveContainer" containerID="824fd644293ef663ba362cace1b788aa52143866b3de49d3b2f15202714957b5" Mar 11 12:22:27 crc kubenswrapper[4816]: I0311 12:22:27.980150 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.023987 4816 scope.go:117] "RemoveContainer" containerID="a8b3b1241d87a2bc94cda4c45011262eeb879b9fb212362f754599d92ce27242" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.098366 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.099379 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.099529 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.099608 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.100564 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104175 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104340 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.099639 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104543 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txrmx\" (UniqueName: \"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104624 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.104682 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"da177cde-6332-4562-809a-d4bee453cebf\" (UID: \"da177cde-6332-4562-809a-d4bee453cebf\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.105274 4816 scope.go:117] "RemoveContainer" containerID="84c64e2c11b5a33088d3e50d684b62246b9937fb898429fa525cc6fb739d9015" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.105807 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: 
"da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.107120 4816 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.107197 4816 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.107210 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da177cde-6332-4562-809a-d4bee453cebf-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.107222 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da177cde-6332-4562-809a-d4bee453cebf-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.109121 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c71feeeb-a44d-42ec-a4c7-ddbf9a76f825/ovn-northd/0.log" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.109273 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.113600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx" (OuterVolumeSpecName: "kube-api-access-txrmx") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). 
InnerVolumeSpecName "kube-api-access-txrmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.138451 4816 scope.go:117] "RemoveContainer" containerID="8ba3c9d212f5a9f10887e454eabe42340558258c07c8285eb982b69803aa3749" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.151698 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.165027 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" path="/var/lib/kubelet/pods/1c94c19c-3ccb-43cc-ab41-92baa3141f73/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.165963 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1da70aee-e1eb-4ad5-b0de-1e2f988dd729" path="/var/lib/kubelet/pods/1da70aee-e1eb-4ad5-b0de-1e2f988dd729/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.166440 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d60557e-d939-46bf-8a60-641016b4d68d" path="/var/lib/kubelet/pods/2d60557e-d939-46bf-8a60-641016b4d68d/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.167166 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" path="/var/lib/kubelet/pods/32dcc96b-186a-444d-bef3-4c5f117ee652/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.168497 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb" path="/var/lib/kubelet/pods/3d936c6f-e3a7-4ffe-ae3c-1ef6b7ff31bb/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 
12:22:28.169129 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" path="/var/lib/kubelet/pods/7457f2db-7979-4d92-bd90-a1464b8a3878/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.170062 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.170371 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7795071e-2de0-43cb-b225-cfed54570d94" path="/var/lib/kubelet/pods/7795071e-2de0-43cb-b225-cfed54570d94/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.171929 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" path="/var/lib/kubelet/pods/7bd939d8-3b22-4496-acea-ac527f3e5149/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.172870 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" path="/var/lib/kubelet/pods/7d73d9d0-5632-47a3-93e0-899f64f51011/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.174016 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" path="/var/lib/kubelet/pods/9a22173f-147b-46ac-bb01-596fe9f12b10/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.177636 4816 generic.go:334] "Generic (PLEG): container finished" podID="26aea2df-f497-478d-b953-060189ef2569" containerID="0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7" exitCode=0 Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.178741 4816 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b1dd25da-51d6-45f0-b70c-f1baa17d2da3" path="/var/lib/kubelet/pods/b1dd25da-51d6-45f0-b70c-f1baa17d2da3/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.180333 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" path="/var/lib/kubelet/pods/b79e89c6-5f56-4439-ad63-a86259d4ed29/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.180902 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c71feeeb-a44d-42ec-a4c7-ddbf9a76f825/ovn-northd/0.log" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.180929 4816 generic.go:334] "Generic (PLEG): container finished" podID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" exitCode=139 Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.181010 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.181299 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" path="/var/lib/kubelet/pods/bedb612d-0e22-4025-9151-d0cf7bc4ee42/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.190462 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "da177cde-6332-4562-809a-d4bee453cebf" (UID: "da177cde-6332-4562-809a-d4bee453cebf"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.190508 4816 scope.go:117] "RemoveContainer" containerID="c020c8caff09b112c5e61167611361a425a1b4a92367fbbd7dbf97390e021cca" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.191038 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" path="/var/lib/kubelet/pods/d28745d2-082d-4c99-90f0-b6c4696fb1a2/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.193033 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" path="/var/lib/kubelet/pods/e95ddca0-76d0-4dce-9983-4b07655adc25/volumes" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.205999 4816 generic.go:334] "Generic (PLEG): container finished" podID="da177cde-6332-4562-809a-d4bee453cebf" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" exitCode=0 Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.206588 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.206200 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerDied","Data":"0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207428 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerDied","Data":"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207466 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825","Type":"ContainerDied","Data":"25d4f9ece0205331680bd83d3d312fa201b0497bc9a8a61346652664c99b99e2"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207479 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerDied","Data":"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207492 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"da177cde-6332-4562-809a-d4bee453cebf","Type":"ContainerDied","Data":"c1304c6acbe0151fcfd1f27a9fb0f616c29bb18a4876bb3def66924a603536ea"} Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.207603 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.211499 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.211915 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.213510 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214110 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214333 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214464 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.214684 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") pod \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\" (UID: \"c71feeeb-a44d-42ec-a4c7-ddbf9a76f825\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.215200 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.215648 4816 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da177cde-6332-4562-809a-d4bee453cebf-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.216111 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txrmx\" (UniqueName: \"kubernetes.io/projected/da177cde-6332-4562-809a-d4bee453cebf-kube-api-access-txrmx\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.216323 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.219279 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts" (OuterVolumeSpecName: "scripts") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.220978 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.227411 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config" (OuterVolumeSpecName: "config") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.231993 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4" (OuterVolumeSpecName: "kube-api-access-7hft4") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "kube-api-access-7hft4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.249715 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.257763 4816 scope.go:117] "RemoveContainer" containerID="4e741a528a024acf7a27b5a7253bef28cff4a22ea41c625ba24158e8c7be76eb" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.261977 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.295767 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.312074 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" (UID: "c71feeeb-a44d-42ec-a4c7-ddbf9a76f825"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.319231 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.319510 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hft4\" (UniqueName: \"kubernetes.io/projected/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-kube-api-access-7hft4\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.319597 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.319826 4816 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.320104 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.320806 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.321018 4816 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.321191 4816 reconciler_common.go:293] 
"Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.361198 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.366837 4816 scope.go:117] "RemoveContainer" containerID="f675def681ebf7bc955ad7437f5bae6532f22f4db4a832aa48a182650e749af2" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.376476 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.394451 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.418202 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.428600 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.437839 4816 scope.go:117] "RemoveContainer" containerID="6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.482904 4816 scope.go:117] "RemoveContainer" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526145 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526229 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526506 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526543 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526568 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.518944 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63567eba_cc2a_4168_9e81_51c1daed5482.slice/crio-188caecd38c19e3561f318ca76a8032bcaad31be23f5090529c90fb8dfd7f7e7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda177cde_6332_4562_809a_d4bee453cebf.slice/crio-c1304c6acbe0151fcfd1f27a9fb0f616c29bb18a4876bb3def66924a603536ea\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5030028c_f574_4334_a837_2430761524b4.slice/crio-6419a001ec72fffb18fae89ec5268f12610ae0c656da26d1ec1980d99bf8c731\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c180505_72c6_498d_bfa5_05f689692bd2.slice/crio-40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5030028c_f574_4334_a837_2430761524b4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c180505_72c6_498d_bfa5_05f689692bd2.slice/crio-conmon-40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63567eba_cc2a_4168_9e81_51c1daed5482.slice\": RecentStats: unable to find data in memory cache]" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.529073 4816 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.529771 4816 scope.go:117] "RemoveContainer" containerID="6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.526610 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530093 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530181 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530369 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530472 4816 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv8dl\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.530510 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") pod \"26aea2df-f497-478d-b953-060189ef2569\" (UID: \"26aea2df-f497-478d-b953-060189ef2569\") " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.531241 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.532610 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.532942 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.538664 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.541072 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7\": container with ID starting with 6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7 not found: ID does not exist" containerID="6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.541190 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7"} err="failed to get container status \"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7\": rpc error: code = NotFound desc = could not find container \"6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7\": container with ID starting with 6e5d751e1033e9d4aef5824d4c13d38308132b4b6b9a60ec26d78186a278dab7 not found: ID does not exist" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.541237 4816 scope.go:117] "RemoveContainer" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.545626 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7\": container with ID starting 
with 8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7 not found: ID does not exist" containerID="8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.545688 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7"} err="failed to get container status \"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7\": rpc error: code = NotFound desc = could not find container \"8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7\": container with ID starting with 8a2953b83fad75911a9aa3b9b53086764c650fc4022cbafe1b2e60fde2fe5be7 not found: ID does not exist" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.545732 4816 scope.go:117] "RemoveContainer" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.546798 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info" (OuterVolumeSpecName: "pod-info") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.554386 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.558918 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl" (OuterVolumeSpecName: "kube-api-access-dv8dl") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "kube-api-access-dv8dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.559516 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data" (OuterVolumeSpecName: "config-data") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.578972 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.579478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.585888 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.601583 4816 scope.go:117] "RemoveContainer" containerID="933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.602361 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf" (OuterVolumeSpecName: "server-conf") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.637822 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638274 4816 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/26aea2df-f497-478d-b953-060189ef2569-pod-info\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638330 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638344 4816 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-server-conf\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638435 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv8dl\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-kube-api-access-dv8dl\") on node \"crc\" DevicePath \"\"" Mar 11 
12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638459 4816 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/26aea2df-f497-478d-b953-060189ef2569-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638528 4816 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/26aea2df-f497-478d-b953-060189ef2569-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638542 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.638594 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.692561 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "26aea2df-f497-478d-b953-060189ef2569" (UID: "26aea2df-f497-478d-b953-060189ef2569"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.693681 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.718588 4816 scope.go:117] "RemoveContainer" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.719428 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492\": container with ID starting with c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492 not found: ID does not exist" containerID="c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.719498 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492"} err="failed to get container status \"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492\": rpc error: code = NotFound desc = could not find container \"c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492\": container with ID starting with c63ed4d8962eaade5fdd56e19833812eb68982f5e9c4239e8a03e5077a42a492 not found: ID does not exist" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.719566 4816 scope.go:117] "RemoveContainer" containerID="933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.719888 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03\": container with ID starting with 
933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03 not found: ID does not exist" containerID="933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.719916 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03"} err="failed to get container status \"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03\": rpc error: code = NotFound desc = could not find container \"933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03\": container with ID starting with 933bdb7df24ef397f527e8ac441de2b3a2e82c07c8ab31ea86b61c45f7139f03 not found: ID does not exist" Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.741981 4816 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:28 crc kubenswrapper[4816]: E0311 12:22:28.742111 4816 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data podName:3779c0f5-9084-4c07-83d9-fe2017559f7b nodeName:}" failed. No retries permitted until 2026-03-11 12:22:36.742081389 +0000 UTC m=+1443.333345386 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b") : configmap "rabbitmq-cell1-config-data" not found Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.743679 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/26aea2df-f497-478d-b953-060189ef2569-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.743710 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:28 crc kubenswrapper[4816]: I0311 12:22:28.944562 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.052903 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.052973 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053033 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: 
\"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053080 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053143 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053172 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053269 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.053339 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") pod \"9c180505-72c6-498d-bfa5-05f689692bd2\" (UID: \"9c180505-72c6-498d-bfa5-05f689692bd2\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.069113 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j" (OuterVolumeSpecName: "kube-api-access-6zc8j") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "kube-api-access-6zc8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.078366 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts" (OuterVolumeSpecName: "scripts") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.079026 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.079737 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.105187 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data" (OuterVolumeSpecName: "config-data") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.117804 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.126519 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156721 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zc8j\" (UniqueName: \"kubernetes.io/projected/9c180505-72c6-498d-bfa5-05f689692bd2-kube-api-access-6zc8j\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156762 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156773 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156783 4816 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156792 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-config-data\") on node \"crc\" 
DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.156803 4816 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.175093 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.185503 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9c180505-72c6-498d-bfa5-05f689692bd2" (UID: "9c180505-72c6-498d-bfa5-05f689692bd2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.242944 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.242955 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"26aea2df-f497-478d-b953-060189ef2569","Type":"ContainerDied","Data":"bd5f7144adb25f2d3d74b32cee4ef0069fc612e5f70830fc738cf8898c918056"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.243069 4816 scope.go:117] "RemoveContainer" containerID="0735cf7e4268f5297289dcfc433ce805028b2098230211ba63ceb121fac25ec7" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257583 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257637 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257659 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257702 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: 
I0311 12:22:29.257722 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257751 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257809 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257854 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257901 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257930 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") pod 
\"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.257965 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") pod \"3779c0f5-9084-4c07-83d9-fe2017559f7b\" (UID: \"3779c0f5-9084-4c07-83d9-fe2017559f7b\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.258359 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.258377 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c180505-72c6-498d-bfa5-05f689692bd2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.258960 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.260851 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.260905 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.266353 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.267056 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.267157 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info" (OuterVolumeSpecName: "pod-info") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.268182 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.270993 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95" (OuterVolumeSpecName: "kube-api-access-mvr95") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "kube-api-access-mvr95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.273572 4816 generic.go:334] "Generic (PLEG): container finished" podID="41f4b502-b85f-488c-b55b-27a31479df68" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" exitCode=0 Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.273675 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"41f4b502-b85f-488c-b55b-27a31479df68","Type":"ContainerDied","Data":"60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.275189 4816 generic.go:334] "Generic (PLEG): container finished" podID="9c180505-72c6-498d-bfa5-05f689692bd2" containerID="40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" exitCode=0 Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.275248 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d6ddcd789-qjf9c" 
event={"ID":"9c180505-72c6-498d-bfa5-05f689692bd2","Type":"ContainerDied","Data":"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.275290 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d6ddcd789-qjf9c" event={"ID":"9c180505-72c6-498d-bfa5-05f689692bd2","Type":"ContainerDied","Data":"210d5da4467eeb407cc3db147ba87bbb3dfcf68d3ca56b768383a1d9ec2cdc8a"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.275371 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d6ddcd789-qjf9c" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.280465 4816 generic.go:334] "Generic (PLEG): container finished" podID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerID="18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" exitCode=0 Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.280614 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerDied","Data":"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.280661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3779c0f5-9084-4c07-83d9-fe2017559f7b","Type":"ContainerDied","Data":"c73f7e4d7f0f4588b80903c0c3810420cc3aeed26ba2c6224b092ad58bda611c"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.280734 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.283901 4816 generic.go:334] "Generic (PLEG): container finished" podID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" exitCode=0 Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.283934 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc","Type":"ContainerDied","Data":"4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2"} Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.317814 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.318933 4816 scope.go:117] "RemoveContainer" containerID="47287b2bd213321105c729d451b069f02c0e309af3b5c9c84b7b9c24acc1a5f3" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.321369 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.325525 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data" (OuterVolumeSpecName: "config-data") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.326997 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.333923 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.334113 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.345436 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.345737 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created 
or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.345766 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.350553 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf" (OuterVolumeSpecName: "server-conf") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.359685 4816 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3779c0f5-9084-4c07-83d9-fe2017559f7b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.359872 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvr95\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-kube-api-access-mvr95\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.359950 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360065 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360194 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360291 4816 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360379 4816 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360451 4816 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360521 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3779c0f5-9084-4c07-83d9-fe2017559f7b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.360601 4816 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3779c0f5-9084-4c07-83d9-fe2017559f7b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.370977 4816 scope.go:117] "RemoveContainer" containerID="40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.371132 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.371181 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.378214 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.381573 4816 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.393397 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5d6ddcd789-qjf9c"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.396936 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3779c0f5-9084-4c07-83d9-fe2017559f7b" (UID: "3779c0f5-9084-4c07-83d9-fe2017559f7b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.437773 4816 scope.go:117] "RemoveContainer" containerID="40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.439149 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8\": container with ID starting with 40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8 not found: ID does not exist" containerID="40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.439457 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8"} err="failed to get container status \"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8\": rpc error: code = NotFound desc = could not find container \"40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8\": container with ID starting with 40d439392c989a322c37ef2903e2b84825cbedf2d8b6499b35bfc3bb665a65b8 not found: ID does not exist" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 
12:22:29.439491 4816 scope.go:117] "RemoveContainer" containerID="18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.463043 4816 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3779c0f5-9084-4c07-83d9-fe2017559f7b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.463096 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.465374 4816 scope.go:117] "RemoveContainer" containerID="522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.487977 4816 scope.go:117] "RemoveContainer" containerID="18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.488729 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f\": container with ID starting with 18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f not found: ID does not exist" containerID="18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.488766 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f"} err="failed to get container status \"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f\": rpc error: code = NotFound desc = could not find container \"18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f\": container with ID starting with 
18565c99cadc85b3c1924a92e447c85ed3ed29fe96a7b6c6961caaecc2e1cf9f not found: ID does not exist" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.488802 4816 scope.go:117] "RemoveContainer" containerID="522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e" Mar 11 12:22:29 crc kubenswrapper[4816]: E0311 12:22:29.489165 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e\": container with ID starting with 522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e not found: ID does not exist" containerID="522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.489190 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e"} err="failed to get container status \"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e\": rpc error: code = NotFound desc = could not find container \"522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e\": container with ID starting with 522cea9d64bd20f40ebb73c1f30df7c2a7a511a9ee7536ce5452bc061096e21e not found: ID does not exist" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.617537 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.624363 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.631975 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.665994 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") pod \"41f4b502-b85f-488c-b55b-27a31479df68\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.666468 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") pod \"41f4b502-b85f-488c-b55b-27a31479df68\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.666499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") pod \"41f4b502-b85f-488c-b55b-27a31479df68\" (UID: \"41f4b502-b85f-488c-b55b-27a31479df68\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.669970 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.673232 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x" (OuterVolumeSpecName: "kube-api-access-cvr8x") pod "41f4b502-b85f-488c-b55b-27a31479df68" (UID: "41f4b502-b85f-488c-b55b-27a31479df68"). 
InnerVolumeSpecName "kube-api-access-cvr8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.703901 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data" (OuterVolumeSpecName: "config-data") pod "41f4b502-b85f-488c-b55b-27a31479df68" (UID: "41f4b502-b85f-488c-b55b-27a31479df68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.706859 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41f4b502-b85f-488c-b55b-27a31479df68" (UID: "41f4b502-b85f-488c-b55b-27a31479df68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768015 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") pod \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768072 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") pod \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\" (UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768148 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") pod \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\" 
(UID: \"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc\") " Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768527 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768554 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvr8x\" (UniqueName: \"kubernetes.io/projected/41f4b502-b85f-488c-b55b-27a31479df68-kube-api-access-cvr8x\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.768567 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f4b502-b85f-488c-b55b-27a31479df68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.771958 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65" (OuterVolumeSpecName: "kube-api-access-4ml65") pod "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" (UID: "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc"). InnerVolumeSpecName "kube-api-access-4ml65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.794564 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" (UID: "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.794878 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data" (OuterVolumeSpecName: "config-data") pod "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" (UID: "f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.870468 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.870516 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ml65\" (UniqueName: \"kubernetes.io/projected/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-kube-api-access-4ml65\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:29 crc kubenswrapper[4816]: I0311 12:22:29.870531 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.012788 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.200:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.141156 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26aea2df-f497-478d-b953-060189ef2569" path="/var/lib/kubelet/pods/26aea2df-f497-478d-b953-060189ef2569/volumes" Mar 11 12:22:30 crc 
kubenswrapper[4816]: I0311 12:22:30.142052 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" path="/var/lib/kubelet/pods/3779c0f5-9084-4c07-83d9-fe2017559f7b/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.143165 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5030028c-f574-4334-a837-2430761524b4" path="/var/lib/kubelet/pods/5030028c-f574-4334-a837-2430761524b4/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.143697 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" path="/var/lib/kubelet/pods/63567eba-cc2a-4168-9e81-51c1daed5482/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.144276 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" path="/var/lib/kubelet/pods/9c180505-72c6-498d-bfa5-05f689692bd2/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.145567 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" path="/var/lib/kubelet/pods/c71feeeb-a44d-42ec-a4c7-ddbf9a76f825/volumes" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.293874 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"41f4b502-b85f-488c-b55b-27a31479df68","Type":"ContainerDied","Data":"0b682ef5a1cffec39d115886fe70f340b2f710836dc8fb73d7380c331ca3d440"} Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.293939 4816 scope.go:117] "RemoveContainer" containerID="60b94a07b73cb13c7f413f3784714ffd08edfbf819bae0acb651dd949e911744" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.293996 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.305320 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc","Type":"ContainerDied","Data":"897a415294b966ad7eb32e075c662fce4ade523bc49b487efdfde948eb76f843"} Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.305419 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.320936 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.323348 4816 scope.go:117] "RemoveContainer" containerID="4e6b0cc9909a80ea9f6820967069b2707e5bf48017858f5840e01461de16f0c2" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.325946 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.337957 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.346907 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.763415 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-64b59f8d4-2vxd9" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 12:22:30 crc kubenswrapper[4816]: I0311 12:22:30.763567 4816 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-64b59f8d4-2vxd9" podUID="7795071e-2de0-43cb-b225-cfed54570d94" 
containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: i/o timeout" Mar 11 12:22:32 crc kubenswrapper[4816]: I0311 12:22:32.142317 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f4b502-b85f-488c-b55b-27a31479df68" path="/var/lib/kubelet/pods/41f4b502-b85f-488c-b55b-27a31479df68/volumes" Mar 11 12:22:32 crc kubenswrapper[4816]: I0311 12:22:32.143359 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" path="/var/lib/kubelet/pods/f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc/volumes" Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.323565 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.324027 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.324237 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" 
containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.324286 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.325459 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.326585 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.327536 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:34 crc kubenswrapper[4816]: E0311 12:22:34.327574 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.274099 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413186 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413567 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413615 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413641 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413689 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7fr9\" (UniqueName: 
\"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413834 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.413913 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") pod \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\" (UID: \"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46\") " Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416440 4816 generic.go:334] "Generic (PLEG): container finished" podID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerID="fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" exitCode=0 Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416489 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerDied","Data":"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035"} Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416524 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6867c6dbc5-lzgfd" event={"ID":"e6833f8f-2414-42cd-b7c2-4d4a70fd8d46","Type":"ContainerDied","Data":"10129169327e9c40582f9c635a8d87b021f99cc78ac017f7e4f16f40942456bc"} Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416558 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6867c6dbc5-lzgfd" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.416586 4816 scope.go:117] "RemoveContainer" containerID="d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.423659 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9" (OuterVolumeSpecName: "kube-api-access-x7fr9") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "kube-api-access-x7fr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.436708 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.459998 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.464657 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.478017 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.479178 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.482340 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config" (OuterVolumeSpecName: "config") pod "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" (UID: "e6833f8f-2414-42cd-b7c2-4d4a70fd8d46"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517227 4816 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517286 4816 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517296 4816 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517307 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517320 4816 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517331 4816 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-config\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.517341 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7fr9\" (UniqueName: \"kubernetes.io/projected/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46-kube-api-access-x7fr9\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.540522 4816 
scope.go:117] "RemoveContainer" containerID="fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.564056 4816 scope.go:117] "RemoveContainer" containerID="d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" Mar 11 12:22:37 crc kubenswrapper[4816]: E0311 12:22:37.564756 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d\": container with ID starting with d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d not found: ID does not exist" containerID="d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.564808 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d"} err="failed to get container status \"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d\": rpc error: code = NotFound desc = could not find container \"d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d\": container with ID starting with d70e65be881ec74becf9f1d8a8c457e2fd9c5cbaed1d9869af0f09ff05b4fe7d not found: ID does not exist" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.564844 4816 scope.go:117] "RemoveContainer" containerID="fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" Mar 11 12:22:37 crc kubenswrapper[4816]: E0311 12:22:37.565397 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035\": container with ID starting with fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035 not found: ID does not exist" containerID="fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035" Mar 11 
12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.565480 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035"} err="failed to get container status \"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035\": rpc error: code = NotFound desc = could not find container \"fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035\": container with ID starting with fc6e871db4cf3ccf1c16a2df0831b957437d80b5ab1f40dfb74553759defd035 not found: ID does not exist" Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.747680 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"] Mar 11 12:22:37 crc kubenswrapper[4816]: I0311 12:22:37.757126 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6867c6dbc5-lzgfd"] Mar 11 12:22:38 crc kubenswrapper[4816]: I0311 12:22:38.146368 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" path="/var/lib/kubelet/pods/e6833f8f-2414-42cd-b7c2-4d4a70fd8d46/volumes" Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.324956 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.325863 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" 
containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.326002 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.326080 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.326105 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.337329 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.340094 4816 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:39 crc kubenswrapper[4816]: E0311 12:22:39.340142 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.324200 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.325495 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.326162 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" 
containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.326241 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.326512 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.328910 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.331578 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:44 crc kubenswrapper[4816]: E0311 12:22:44.331649 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:44 crc kubenswrapper[4816]: I0311 12:22:44.419798 4816 scope.go:117] "RemoveContainer" containerID="5d6df61e0b509a66b3346da65b74fba3a74851e8e005a57c5d0fba5a7957a438" Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.323060 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.324407 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.324407 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.324974 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c 
is running failed: container process not found" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.325038 4816 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.326693 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.328224 4816 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 11 12:22:49 crc kubenswrapper[4816]: E0311 12:22:49.328404 4816 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tnhfq" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.524648 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-tnhfq_edc01aa4-013d-4d10-9f22-e5f319e6c1a3/ovs-vswitchd/0.log" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.526508 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.566773 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567231 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7685\" (UniqueName: \"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.566897 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log" (OuterVolumeSpecName: "var-log") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567344 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567406 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567442 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567513 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") pod \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\" (UID: \"edc01aa4-013d-4d10-9f22-e5f319e6c1a3\") " Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.567950 4816 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-log\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.568007 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run" (OuterVolumeSpecName: "var-run") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: 
"edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.568048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.568082 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib" (OuterVolumeSpecName: "var-lib") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.568523 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts" (OuterVolumeSpecName: "scripts") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.575917 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685" (OuterVolumeSpecName: "kube-api-access-z7685") pod "edc01aa4-013d-4d10-9f22-e5f319e6c1a3" (UID: "edc01aa4-013d-4d10-9f22-e5f319e6c1a3"). InnerVolumeSpecName "kube-api-access-z7685". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.585549 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tnhfq_edc01aa4-013d-4d10-9f22-e5f319e6c1a3/ovs-vswitchd/0.log" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.586707 4816 generic.go:334] "Generic (PLEG): container finished" podID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" exitCode=137 Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.586766 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerDied","Data":"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005"} Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.586806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tnhfq" event={"ID":"edc01aa4-013d-4d10-9f22-e5f319e6c1a3","Type":"ContainerDied","Data":"22c727583d6de2eec899c37134713c754f06d9d2f697ad226095e328238d230b"} Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.586833 4816 scope.go:117] "RemoveContainer" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.587036 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tnhfq" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.623791 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"] Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.626738 4816 scope.go:117] "RemoveContainer" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.629663 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tnhfq"] Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.646046 4816 scope.go:117] "RemoveContainer" containerID="ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669858 4816 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-lib\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669900 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669912 4816 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669926 4816 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.669938 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7685\" (UniqueName: 
\"kubernetes.io/projected/edc01aa4-013d-4d10-9f22-e5f319e6c1a3-kube-api-access-z7685\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.687223 4816 scope.go:117] "RemoveContainer" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" Mar 11 12:22:50 crc kubenswrapper[4816]: E0311 12:22:50.687864 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005\": container with ID starting with 9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005 not found: ID does not exist" containerID="9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.687931 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005"} err="failed to get container status \"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005\": rpc error: code = NotFound desc = could not find container \"9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005\": container with ID starting with 9a502cdadbe9ccdd4397f8d7b5976f7b8a5bbe2117d028536e6c60520f500005 not found: ID does not exist" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.687961 4816 scope.go:117] "RemoveContainer" containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" Mar 11 12:22:50 crc kubenswrapper[4816]: E0311 12:22:50.688440 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c\": container with ID starting with e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c not found: ID does not exist" 
containerID="e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.688496 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c"} err="failed to get container status \"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c\": rpc error: code = NotFound desc = could not find container \"e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c\": container with ID starting with e7be3cc3e488c05059fa7b6a1b844edb89da0e86587c805c18bac6144b80869c not found: ID does not exist" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.688537 4816 scope.go:117] "RemoveContainer" containerID="ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b" Mar 11 12:22:50 crc kubenswrapper[4816]: E0311 12:22:50.688873 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b\": container with ID starting with ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b not found: ID does not exist" containerID="ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b" Mar 11 12:22:50 crc kubenswrapper[4816]: I0311 12:22:50.688909 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b"} err="failed to get container status \"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b\": rpc error: code = NotFound desc = could not find container \"ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b\": container with ID starting with ade1b2e704074d979ee946ca1a74e500865d981040caad349410a4164bba988b not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.229777 4816 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278645 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278759 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278799 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278861 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278887 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.278940 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") pod \"485f9fbd-e0ca-472d-b97c-87c127253a96\" (UID: \"485f9fbd-e0ca-472d-b97c-87c127253a96\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.279486 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache" (OuterVolumeSpecName: "cache") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.279823 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock" (OuterVolumeSpecName: "lock") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.283739 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.285400 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.285609 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r" (OuterVolumeSpecName: "kube-api-access-rbb5r") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). InnerVolumeSpecName "kube-api-access-rbb5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.311432 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.380662 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.380772 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.380820 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.380934 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381064 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381100 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381143 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") pod \"594ad696-b727-4153-979f-d32ccdc1fe83\" (UID: \"594ad696-b727-4153-979f-d32ccdc1fe83\") " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381453 4816 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/594ad696-b727-4153-979f-d32ccdc1fe83-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381473 4816 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 
12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381484 4816 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-cache\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381509 4816 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381523 4816 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485f9fbd-e0ca-472d-b97c-87c127253a96-lock\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.381537 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbb5r\" (UniqueName: \"kubernetes.io/projected/485f9fbd-e0ca-472d-b97c-87c127253a96-kube-api-access-rbb5r\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.383883 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.385823 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts" (OuterVolumeSpecName: "scripts") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.386126 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z" (OuterVolumeSpecName: "kube-api-access-bmj7z") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "kube-api-access-bmj7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.395359 4816 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.412292 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.448012 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data" (OuterVolumeSpecName: "config-data") pod "594ad696-b727-4153-979f-d32ccdc1fe83" (UID: "594ad696-b727-4153-979f-d32ccdc1fe83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.482592 4816 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.482995 4816 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.483006 4816 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.483015 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.483038 4816 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594ad696-b727-4153-979f-d32ccdc1fe83-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.483048 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmj7z\" (UniqueName: \"kubernetes.io/projected/594ad696-b727-4153-979f-d32ccdc1fe83-kube-api-access-bmj7z\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.532818 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "485f9fbd-e0ca-472d-b97c-87c127253a96" (UID: "485f9fbd-e0ca-472d-b97c-87c127253a96"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.585141 4816 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485f9fbd-e0ca-472d-b97c-87c127253a96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599444 4816 generic.go:334] "Generic (PLEG): container finished" podID="594ad696-b727-4153-979f-d32ccdc1fe83" containerID="b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" exitCode=137 Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599487 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerDied","Data":"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac"} Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599522 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599557 4816 scope.go:117] "RemoveContainer" containerID="4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.599544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"594ad696-b727-4153-979f-d32ccdc1fe83","Type":"ContainerDied","Data":"883c96453eeb3dc398341c2c3b80a740484d91dd773b0fcfe0237a4112b6097a"} Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.607384 4816 generic.go:334] "Generic (PLEG): container finished" podID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerID="4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" exitCode=137 Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.607463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e"} Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.607495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"485f9fbd-e0ca-472d-b97c-87c127253a96","Type":"ContainerDied","Data":"bc1ccba63ef105a914d68b8eed3c206cfda92b47e6236ce5828d528e3ceb9770"} Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.607592 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.626526 4816 scope.go:117] "RemoveContainer" containerID="b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.639583 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.649182 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.655848 4816 scope.go:117] "RemoveContainer" containerID="4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.656821 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8\": container with ID starting with 4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8 not found: ID does not exist" containerID="4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.656867 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8"} err="failed to get container status \"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8\": rpc error: code = NotFound desc = could not find container \"4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8\": container with ID starting with 4e77bdf5f0e95052069948c26d832a542a6227e380d1cfa3a0483957659bccc8 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.656900 4816 scope.go:117] "RemoveContainer" containerID="b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" Mar 11 12:22:51 crc 
kubenswrapper[4816]: E0311 12:22:51.657131 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac\": container with ID starting with b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac not found: ID does not exist" containerID="b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.657162 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac"} err="failed to get container status \"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac\": rpc error: code = NotFound desc = could not find container \"b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac\": container with ID starting with b969ee005f965c2a4f02537599e354572cbc91b2ebbe38115a382a8ec4f6b2ac not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.657186 4816 scope.go:117] "RemoveContainer" containerID="4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.663212 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.670192 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.676179 4816 scope.go:117] "RemoveContainer" containerID="68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.694095 4816 scope.go:117] "RemoveContainer" containerID="cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.711282 4816 scope.go:117] "RemoveContainer" 
containerID="1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.729211 4816 scope.go:117] "RemoveContainer" containerID="25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.749427 4816 scope.go:117] "RemoveContainer" containerID="3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.767112 4816 scope.go:117] "RemoveContainer" containerID="fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.785322 4816 scope.go:117] "RemoveContainer" containerID="bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.803367 4816 scope.go:117] "RemoveContainer" containerID="a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.821132 4816 scope.go:117] "RemoveContainer" containerID="424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.848005 4816 scope.go:117] "RemoveContainer" containerID="9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.872492 4816 scope.go:117] "RemoveContainer" containerID="712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.890012 4816 scope.go:117] "RemoveContainer" containerID="acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.909431 4816 scope.go:117] "RemoveContainer" containerID="d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.928698 4816 scope.go:117] "RemoveContainer" 
containerID="e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.952181 4816 scope.go:117] "RemoveContainer" containerID="4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.953076 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e\": container with ID starting with 4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e not found: ID does not exist" containerID="4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.953144 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e"} err="failed to get container status \"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e\": rpc error: code = NotFound desc = could not find container \"4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e\": container with ID starting with 4ca46893f461e4cae0bfdd754a912325d0a4b5274975f49336f5fe227e8b6f7e not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.953190 4816 scope.go:117] "RemoveContainer" containerID="68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.953579 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2\": container with ID starting with 68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2 not found: ID does not exist" containerID="68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2" Mar 11 12:22:51 crc 
kubenswrapper[4816]: I0311 12:22:51.953613 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2"} err="failed to get container status \"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2\": rpc error: code = NotFound desc = could not find container \"68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2\": container with ID starting with 68827d94971fb4946739508c0c2229d08412fbe98f89ce92ce344232eb5179c2 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.953632 4816 scope.go:117] "RemoveContainer" containerID="cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.954354 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd\": container with ID starting with cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd not found: ID does not exist" containerID="cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.954414 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd"} err="failed to get container status \"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd\": rpc error: code = NotFound desc = could not find container \"cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd\": container with ID starting with cfd9d9bff16dc1b372451554525cfb877c302a88a1df111a3dec64d0abe2d5dd not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.954435 4816 scope.go:117] "RemoveContainer" containerID="1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" Mar 11 
12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.954867 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61\": container with ID starting with 1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61 not found: ID does not exist" containerID="1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.954897 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61"} err="failed to get container status \"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61\": rpc error: code = NotFound desc = could not find container \"1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61\": container with ID starting with 1a5cccec1988de28bd2809ac4b5b0048b290948debe0553adf7fdb6a721fdf61 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.954915 4816 scope.go:117] "RemoveContainer" containerID="25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.955408 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477\": container with ID starting with 25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477 not found: ID does not exist" containerID="25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.955437 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477"} err="failed to get container status 
\"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477\": rpc error: code = NotFound desc = could not find container \"25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477\": container with ID starting with 25b01234ff673f68a2a7d9f83db659ac9f58778b1b6460a3b9e17bc11c9e8477 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.955456 4816 scope.go:117] "RemoveContainer" containerID="3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.955759 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4\": container with ID starting with 3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4 not found: ID does not exist" containerID="3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.955793 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4"} err="failed to get container status \"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4\": rpc error: code = NotFound desc = could not find container \"3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4\": container with ID starting with 3b5ce1950c94241c7d8db075f74a9e25d16f22897d67828fc597eed2fd2ba2d4 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.955813 4816 scope.go:117] "RemoveContainer" containerID="fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.956071 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db\": container with ID starting with fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db not found: ID does not exist" containerID="fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.956100 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db"} err="failed to get container status \"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db\": rpc error: code = NotFound desc = could not find container \"fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db\": container with ID starting with fbc40b5edb4819684be613e55b321d899bc5b2698e897cf6eda8f15eae8281db not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.956117 4816 scope.go:117] "RemoveContainer" containerID="bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.956739 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190\": container with ID starting with bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190 not found: ID does not exist" containerID="bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.956816 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190"} err="failed to get container status \"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190\": rpc error: code = NotFound desc = could not find container \"bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190\": container with ID 
starting with bed6744a1fb9636a9fc4ea915948476f2eb984fea2bdb9d698c12e5780346190 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.956879 4816 scope.go:117] "RemoveContainer" containerID="a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.957335 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1\": container with ID starting with a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1 not found: ID does not exist" containerID="a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.957370 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1"} err="failed to get container status \"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1\": rpc error: code = NotFound desc = could not find container \"a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1\": container with ID starting with a0947e58e27e62d7256e48c3a5ba6d36f58462add34b2f1281e8c3da0f4574e1 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.957393 4816 scope.go:117] "RemoveContainer" containerID="424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.957776 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf\": container with ID starting with 424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf not found: ID does not exist" containerID="424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf" Mar 11 
12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.957826 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf"} err="failed to get container status \"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf\": rpc error: code = NotFound desc = could not find container \"424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf\": container with ID starting with 424b40cca785fdb6cef5ca70bab8c7fb8928ab75e5bb80b8b1faf2c2da22fdaf not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.957863 4816 scope.go:117] "RemoveContainer" containerID="9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.958310 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000\": container with ID starting with 9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000 not found: ID does not exist" containerID="9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958339 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000"} err="failed to get container status \"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000\": rpc error: code = NotFound desc = could not find container \"9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000\": container with ID starting with 9aa4725cabfa8b52948323edfacbce1db8fbe4349baf7e60df04631c4c07e000 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958357 4816 scope.go:117] "RemoveContainer" 
containerID="712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.958646 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280\": container with ID starting with 712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280 not found: ID does not exist" containerID="712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958678 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280"} err="failed to get container status \"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280\": rpc error: code = NotFound desc = could not find container \"712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280\": container with ID starting with 712d42df455f320b81d9b4c5385e08e78c8fffd9af1f0f1a30be961c52606280 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958699 4816 scope.go:117] "RemoveContainer" containerID="acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.958953 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898\": container with ID starting with acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898 not found: ID does not exist" containerID="acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.958985 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898"} err="failed to get container status \"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898\": rpc error: code = NotFound desc = could not find container \"acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898\": container with ID starting with acad9fd17d268a24643ea62be228693020bbd2da3c63a2bc6d162877b0366898 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.959001 4816 scope.go:117] "RemoveContainer" containerID="d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.959361 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6\": container with ID starting with d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6 not found: ID does not exist" containerID="d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.959390 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6"} err="failed to get container status \"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6\": rpc error: code = NotFound desc = could not find container \"d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6\": container with ID starting with d0ccfde3e8badc0e6b92993021ad07fe9ae8e33939c137e6eb3bcf22e04b1ea6 not found: ID does not exist" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.959413 4816 scope.go:117] "RemoveContainer" containerID="e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" Mar 11 12:22:51 crc kubenswrapper[4816]: E0311 12:22:51.959816 4816 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d\": container with ID starting with e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d not found: ID does not exist" containerID="e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d" Mar 11 12:22:51 crc kubenswrapper[4816]: I0311 12:22:51.959845 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d"} err="failed to get container status \"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d\": rpc error: code = NotFound desc = could not find container \"e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d\": container with ID starting with e84af5bcfa14831e3963b52fe73c49f1f89ea652b5b69cd65dfb4008756c4c2d not found: ID does not exist" Mar 11 12:22:52 crc kubenswrapper[4816]: I0311 12:22:52.148223 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" path="/var/lib/kubelet/pods/485f9fbd-e0ca-472d-b97c-87c127253a96/volumes" Mar 11 12:22:52 crc kubenswrapper[4816]: I0311 12:22:52.151309 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" path="/var/lib/kubelet/pods/594ad696-b727-4153-979f-d32ccdc1fe83/volumes" Mar 11 12:22:52 crc kubenswrapper[4816]: I0311 12:22:52.152211 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" path="/var/lib/kubelet/pods/edc01aa4-013d-4d10-9f22-e5f319e6c1a3/volumes" Mar 11 12:22:58 crc kubenswrapper[4816]: I0311 12:22:58.350406 4816 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podda177cde-6332-4562-809a-d4bee453cebf"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort 
podda177cde-6332-4562-809a-d4bee453cebf] : Timed out while waiting for systemd to remove kubepods-besteffort-podda177cde_6332_4562_809a_d4bee453cebf.slice" Mar 11 12:22:58 crc kubenswrapper[4816]: E0311 12:22:58.351159 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podda177cde-6332-4562-809a-d4bee453cebf] : unable to destroy cgroup paths for cgroup [kubepods besteffort podda177cde-6332-4562-809a-d4bee453cebf] : Timed out while waiting for systemd to remove kubepods-besteffort-podda177cde_6332_4562_809a_d4bee453cebf.slice" pod="openstack/openstack-galera-0" podUID="da177cde-6332-4562-809a-d4bee453cebf" Mar 11 12:22:58 crc kubenswrapper[4816]: I0311 12:22:58.701440 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 12:22:58 crc kubenswrapper[4816]: I0311 12:22:58.754076 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:22:58 crc kubenswrapper[4816]: I0311 12:22:58.759929 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 12:23:00 crc kubenswrapper[4816]: I0311 12:23:00.139907 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da177cde-6332-4562-809a-d4bee453cebf" path="/var/lib/kubelet/pods/da177cde-6332-4562-809a-d4bee453cebf/volumes" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.200121 4816 scope.go:117] "RemoveContainer" containerID="0b4c4c1c298f57878044bac49cc49a719acfc3a0f87a1803c19c539d85446637" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.228091 4816 scope.go:117] "RemoveContainer" containerID="8c240088bce92d648a44cfc826778c591f6601fbc70cdbc9325a1348704e1a92" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.250908 4816 scope.go:117] "RemoveContainer" containerID="c460fb14090c9d550203cf386e04b06e3563514702df072e42ad5fc80f7e1872" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 
12:23:45.277393 4816 scope.go:117] "RemoveContainer" containerID="0de73c3da519dc3d23fdd410a58406f0ff5aec8f4b5e6483b5c4a546f3b60ef0" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.307116 4816 scope.go:117] "RemoveContainer" containerID="c9628d19e1e9c78361e9677b8afa40ad86295ad47aa0110e4b51ead3233c90ca" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.349655 4816 scope.go:117] "RemoveContainer" containerID="a8f8ba02ac608528a8da635158a48ff55377bd4734bbd746e513b637d5d907d3" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.386647 4816 scope.go:117] "RemoveContainer" containerID="d63f60636f6e53982a24004e405c34ed67500a9193f04b98e8d29856c8e89ee2" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.419309 4816 scope.go:117] "RemoveContainer" containerID="3dfe4dd28e66c33830345db1226180f842f3ae3d59f4fa3a4c553af39dd07c67" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.439218 4816 scope.go:117] "RemoveContainer" containerID="5140353c5c6034db1623dc2f3c189d72ec962703a0a91d22d2e279ead073afac" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.459670 4816 scope.go:117] "RemoveContainer" containerID="ab6525891e160f8b83901124157238a30564c85220f9440c25fb3222634839c7" Mar 11 12:23:45 crc kubenswrapper[4816]: I0311 12:23:45.483207 4816 scope.go:117] "RemoveContainer" containerID="d6e7d3be2f695e55ef6abf84c83d060683eae93e0020c12fe8744829cbcc1d6a" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.387468 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzzv2"] Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388424 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388442 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: 
E0311 12:23:50.388470 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388477 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388489 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="openstack-network-exporter" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388499 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="openstack-network-exporter" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388506 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388513 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388523 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388532 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388546 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="cinder-scheduler" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388553 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="cinder-scheduler" Mar 11 12:23:50 crc 
kubenswrapper[4816]: E0311 12:23:50.388566 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388574 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388582 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-central-agent" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388588 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-central-agent" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388596 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="sg-core" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388602 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="sg-core" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388615 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-updater" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388625 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-updater" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388635 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-reaper" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388642 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-reaper" Mar 11 12:23:50 crc 
kubenswrapper[4816]: E0311 12:23:50.388657 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388664 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388675 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388682 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388700 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388707 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388751 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="mysql-bootstrap" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388760 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="mysql-bootstrap" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388772 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="setup-container" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388780 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="setup-container" Mar 11 12:23:50 crc 
kubenswrapper[4816]: E0311 12:23:50.388790 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" containerName="keystone-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388799 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" containerName="keystone-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388811 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388821 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388832 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388839 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388852 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388859 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388867 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5030028c-f574-4334-a837-2430761524b4" containerName="memcached" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388874 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5030028c-f574-4334-a837-2430761524b4" containerName="memcached" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 
12:23:50.388885 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388892 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388901 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388908 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388919 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388926 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388932 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388940 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388952 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388959 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-replicator" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388969 4816 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388976 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.388986 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.388994 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389003 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-updater" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389012 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-updater" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389024 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="probe" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389031 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="probe" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389044 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389051 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389063 4816 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="setup-container" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389070 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="setup-container" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389081 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server-init" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389088 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server-init" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389099 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-server" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389106 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-server" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389118 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-server" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-server" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389138 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="proxy-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389146 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="proxy-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389157 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="galera" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389164 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="galera" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389172 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389180 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389189 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389197 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389208 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="mysql-bootstrap" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389216 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="mysql-bootstrap" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389228 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="rsync" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389236 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="rsync" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389267 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" 
containerName="neutron-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389276 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-httpd" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389289 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389296 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389305 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389313 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389328 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389336 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389348 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-expirer" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389356 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-expirer" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389365 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-log" Mar 11 
12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389373 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389387 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389395 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389407 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389415 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api-log" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389431 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-notification-agent" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389440 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-notification-agent" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389449 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389517 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389538 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-auditor" Mar 11 
12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389545 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-auditor" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389586 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389596 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389607 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-server" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389615 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-server" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389624 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389634 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389645 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq" Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389653 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq" Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389662 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" 
containerName="swift-recon-cron"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389669 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="swift-recon-cron"
Mar 11 12:23:50 crc kubenswrapper[4816]: E0311 12:23:50.389696 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.389705 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390095 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390136 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c180505-72c6-498d-bfa5-05f689692bd2" containerName="keystone-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390175 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-metadata"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390190 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovs-vswitchd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390202 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-expirer"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390215 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d73d9d0-5632-47a3-93e0-899f64f51011" containerName="nova-metadata-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390676 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390702 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-server"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390730 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dcc96b-186a-444d-bef3-4c5f117ee652" containerName="kube-state-metrics"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390741 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc01aa4-013d-4d10-9f22-e5f319e6c1a3" containerName="ovsdb-server"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390752 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="63567eba-cc2a-4168-9e81-51c1daed5482" containerName="nova-cell1-conductor-conductor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390767 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-httpd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390778 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390793 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6833f8f-2414-42cd-b7c2-4d4a70fd8d46" containerName="neutron-httpd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390804 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3779c0f5-9084-4c07-83d9-fe2017559f7b" containerName="rabbitmq"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390814 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-server"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390825 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="cinder-scheduler"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390839 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="openstack-network-exporter"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390849 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="proxy-httpd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390857 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a22173f-147b-46ac-bb01-596fe9f12b10" containerName="galera"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390868 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390876 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390885 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-reaper"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390899 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="da177cde-6332-4562-809a-d4bee453cebf" containerName="galera"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390907 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="594ad696-b727-4153-979f-d32ccdc1fe83" containerName="probe"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390916 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-updater"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390925 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="rsync"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390934 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-auditor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390942 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="26aea2df-f497-478d-b953-060189ef2569" containerName="rabbitmq"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390949 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-auditor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390962 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390973 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-notification-agent"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390982 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-replicator"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.390990 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="swift-recon-cron"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391000 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-auditor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391012 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7795071e-2de0-43cb-b225-cfed54570d94" containerName="barbican-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391021 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7457f2db-7979-4d92-bd90-a1464b8a3878" containerName="glance-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391032 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-replicator"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391043 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c94c19c-3ccb-43cc-ab41-92baa3141f73" containerName="cinder-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391055 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="object-updater"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391063 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f4b502-b85f-488c-b55b-27a31479df68" containerName="nova-scheduler-scheduler"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391072 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5030028c-f574-4334-a837-2430761524b4" containerName="memcached"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391083 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391094 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="account-replicator"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391104 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd939d8-3b22-4496-acea-ac527f3e5149" containerName="placement-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391116 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71feeeb-a44d-42ec-a4c7-ddbf9a76f825" containerName="ovn-northd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391125 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9eb0dee-5bdb-4ca4-a746-d33e8b7d20cc" containerName="nova-cell0-conductor-conductor"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391134 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79e89c6-5f56-4439-ad63-a86259d4ed29" containerName="barbican-worker-log"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391142 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="ceilometer-central-agent"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391154 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d28745d2-082d-4c99-90f0-b6c4696fb1a2" containerName="nova-api-api"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391163 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="485f9fbd-e0ca-472d-b97c-87c127253a96" containerName="container-server"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391172 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95ddca0-76d0-4dce-9983-4b07655adc25" containerName="glance-httpd"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.391184 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedb612d-0e22-4025-9151-d0cf7bc4ee42" containerName="sg-core"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.392913 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.403089 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.495746 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.496329 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.496470 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.598394 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.598504 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.598561 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.599007 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.599340 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.619081 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") pod \"community-operators-pzzv2\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") " pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:50 crc kubenswrapper[4816]: I0311 12:23:50.716511 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:23:51 crc kubenswrapper[4816]: I0311 12:23:51.245264 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:23:51 crc kubenswrapper[4816]: W0311 12:23:51.248216 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7330c4_f58c_4f88_b3bb_57a9330ff446.slice/crio-43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221 WatchSource:0}: Error finding container 43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221: Status 404 returned error can't find the container with id 43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221
Mar 11 12:23:52 crc kubenswrapper[4816]: I0311 12:23:52.180161 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerID="dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47" exitCode=0
Mar 11 12:23:52 crc kubenswrapper[4816]: I0311 12:23:52.180288 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerDied","Data":"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"}
Mar 11 12:23:52 crc kubenswrapper[4816]: I0311 12:23:52.180649 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerStarted","Data":"43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221"}
Mar 11 12:23:54 crc kubenswrapper[4816]: I0311 12:23:54.202660 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerID="0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4" exitCode=0
Mar 11 12:23:54 crc kubenswrapper[4816]: I0311 12:23:54.203064 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerDied","Data":"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"}
Mar 11 12:23:55 crc kubenswrapper[4816]: I0311 12:23:55.214882 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerStarted","Data":"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"}
Mar 11 12:23:55 crc kubenswrapper[4816]: I0311 12:23:55.238312 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzzv2" podStartSLOduration=2.7595729159999998 podStartE2EDuration="5.238286644s" podCreationTimestamp="2026-03-11 12:23:50 +0000 UTC" firstStartedPulling="2026-03-11 12:23:52.182111565 +0000 UTC m=+1518.773375532" lastFinishedPulling="2026-03-11 12:23:54.660825293 +0000 UTC m=+1521.252089260" observedRunningTime="2026-03-11 12:23:55.233851866 +0000 UTC m=+1521.825115843" watchObservedRunningTime="2026-03-11 12:23:55.238286644 +0000 UTC m=+1521.829550611"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.145294 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"]
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.147125 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.150239 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.151230 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.151926 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.155614 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"]
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.257390 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") pod \"auto-csr-approver-29553864-r52vk\" (UID: \"b1c2a96e-0361-49ae-b1d2-795744511b15\") " pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.359362 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") pod \"auto-csr-approver-29553864-r52vk\" (UID: \"b1c2a96e-0361-49ae-b1d2-795744511b15\") " pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.384906 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") pod \"auto-csr-approver-29553864-r52vk\" (UID: \"b1c2a96e-0361-49ae-b1d2-795744511b15\") " pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.482594 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.717037 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.717131 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.768760 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:00 crc kubenswrapper[4816]: I0311 12:24:00.942672 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"]
Mar 11 12:24:01 crc kubenswrapper[4816]: I0311 12:24:01.267963 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553864-r52vk" event={"ID":"b1c2a96e-0361-49ae-b1d2-795744511b15","Type":"ContainerStarted","Data":"de6b160ba6a61c023d6fa94cf65dda0513e45282ed69b2420032bf641b5e229b"}
Mar 11 12:24:01 crc kubenswrapper[4816]: I0311 12:24:01.320363 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:01 crc kubenswrapper[4816]: I0311 12:24:01.388563 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:24:02 crc kubenswrapper[4816]: I0311 12:24:02.284089 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553864-r52vk" event={"ID":"b1c2a96e-0361-49ae-b1d2-795744511b15","Type":"ContainerStarted","Data":"abd9d62fb0ff8a700d3029ac698637da8b39d38073d7e8f33a437d1d746d66d8"}
Mar 11 12:24:02 crc kubenswrapper[4816]: I0311 12:24:02.310331 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553864-r52vk" podStartSLOduration=1.425145175 podStartE2EDuration="2.31030505s" podCreationTimestamp="2026-03-11 12:24:00 +0000 UTC" firstStartedPulling="2026-03-11 12:24:00.949632934 +0000 UTC m=+1527.540896911" lastFinishedPulling="2026-03-11 12:24:01.834792819 +0000 UTC m=+1528.426056786" observedRunningTime="2026-03-11 12:24:02.302348091 +0000 UTC m=+1528.893612068" watchObservedRunningTime="2026-03-11 12:24:02.31030505 +0000 UTC m=+1528.901569037"
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.293598 4816 generic.go:334] "Generic (PLEG): container finished" podID="b1c2a96e-0361-49ae-b1d2-795744511b15" containerID="abd9d62fb0ff8a700d3029ac698637da8b39d38073d7e8f33a437d1d746d66d8" exitCode=0
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.293653 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553864-r52vk" event={"ID":"b1c2a96e-0361-49ae-b1d2-795744511b15","Type":"ContainerDied","Data":"abd9d62fb0ff8a700d3029ac698637da8b39d38073d7e8f33a437d1d746d66d8"}
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.293892 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzzv2" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="registry-server" containerID="cri-o://604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b" gracePeriod=2
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.767645 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.832063 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") pod \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") "
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.832199 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") pod \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") "
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.832243 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") pod \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\" (UID: \"1f7330c4-f58c-4f88-b3bb-57a9330ff446\") "
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.833577 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities" (OuterVolumeSpecName: "utilities") pod "1f7330c4-f58c-4f88-b3bb-57a9330ff446" (UID: "1f7330c4-f58c-4f88-b3bb-57a9330ff446"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.840566 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl" (OuterVolumeSpecName: "kube-api-access-d6jcl") pod "1f7330c4-f58c-4f88-b3bb-57a9330ff446" (UID: "1f7330c4-f58c-4f88-b3bb-57a9330ff446"). InnerVolumeSpecName "kube-api-access-d6jcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.934213 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 12:24:03 crc kubenswrapper[4816]: I0311 12:24:03.934765 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6jcl\" (UniqueName: \"kubernetes.io/projected/1f7330c4-f58c-4f88-b3bb-57a9330ff446-kube-api-access-d6jcl\") on node \"crc\" DevicePath \"\""
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.303936 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f7330c4-f58c-4f88-b3bb-57a9330ff446" (UID: "1f7330c4-f58c-4f88-b3bb-57a9330ff446"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308044 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerID="604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b" exitCode=0
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308122 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerDied","Data":"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"}
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308202 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzzv2" event={"ID":"1f7330c4-f58c-4f88-b3bb-57a9330ff446","Type":"ContainerDied","Data":"43e44bc48337ed8b379562414b926c9f09f2c6ffa9dd0f3c3749a4ce1d503221"}
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308230 4816 scope.go:117] "RemoveContainer" containerID="604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.308142 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzzv2"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.347854 4816 scope.go:117] "RemoveContainer" containerID="0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.348172 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7330c4-f58c-4f88-b3bb-57a9330ff446-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.357882 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.371230 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzzv2"]
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.387076 4816 scope.go:117] "RemoveContainer" containerID="dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.417940 4816 scope.go:117] "RemoveContainer" containerID="604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"
Mar 11 12:24:04 crc kubenswrapper[4816]: E0311 12:24:04.418944 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b\": container with ID starting with 604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b not found: ID does not exist" containerID="604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.418991 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b"} err="failed to get container status \"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b\": rpc error: code = NotFound desc = could not find container \"604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b\": container with ID starting with 604d0a8cddc27e4a447ebf3a8dc36b0eed2b076b129b94d407bc1de3f75a642b not found: ID does not exist"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.419016 4816 scope.go:117] "RemoveContainer" containerID="0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"
Mar 11 12:24:04 crc kubenswrapper[4816]: E0311 12:24:04.419982 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4\": container with ID starting with 0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4 not found: ID does not exist" containerID="0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.420079 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4"} err="failed to get container status \"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4\": rpc error: code = NotFound desc = could not find container \"0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4\": container with ID starting with 0718e1f9a971272036b149ae9c1f20dfd8dbb75e572667bcff0ca5b0fa6574a4 not found: ID does not exist"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.420131 4816 scope.go:117] "RemoveContainer" containerID="dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"
Mar 11 12:24:04 crc kubenswrapper[4816]: E0311 12:24:04.420641 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47\": container with ID starting with dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47 not found: ID does not exist" containerID="dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.420684 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47"} err="failed to get container status \"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47\": rpc error: code = NotFound desc = could not find container \"dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47\": container with ID starting with dc661cd0f3c21648eef4e6515cef8363a39276b7d6c63b205f7150e0f2074c47 not found: ID does not exist"
Mar 11 12:24:04 crc kubenswrapper[4816]: E0311 12:24:04.540235 4816 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.691597 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.753728 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") pod \"b1c2a96e-0361-49ae-b1d2-795744511b15\" (UID: \"b1c2a96e-0361-49ae-b1d2-795744511b15\") "
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.759736 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm" (OuterVolumeSpecName: "kube-api-access-wldbm") pod "b1c2a96e-0361-49ae-b1d2-795744511b15" (UID: "b1c2a96e-0361-49ae-b1d2-795744511b15"). InnerVolumeSpecName "kube-api-access-wldbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:24:04 crc kubenswrapper[4816]: I0311 12:24:04.855692 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wldbm\" (UniqueName: \"kubernetes.io/projected/b1c2a96e-0361-49ae-b1d2-795744511b15-kube-api-access-wldbm\") on node \"crc\" DevicePath \"\""
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.321561 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553864-r52vk" event={"ID":"b1c2a96e-0361-49ae-b1d2-795744511b15","Type":"ContainerDied","Data":"de6b160ba6a61c023d6fa94cf65dda0513e45282ed69b2420032bf641b5e229b"}
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.321614 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de6b160ba6a61c023d6fa94cf65dda0513e45282ed69b2420032bf641b5e229b"
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.321620 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553864-r52vk"
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.379679 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"]
Mar 11 12:24:05 crc kubenswrapper[4816]: I0311 12:24:05.389441 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553858-brk44"]
Mar 11 12:24:06 crc kubenswrapper[4816]: I0311 12:24:06.142924 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" path="/var/lib/kubelet/pods/1f7330c4-f58c-4f88-b3bb-57a9330ff446/volumes"
Mar 11 12:24:06 crc kubenswrapper[4816]: I0311 12:24:06.144145 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d8cc8e-8af3-41b3-bb8c-6e4e10f00193" path="/var/lib/kubelet/pods/99d8cc8e-8af3-41b3-bb8c-6e4e10f00193/volumes"
Mar 11 12:24:09 crc kubenswrapper[4816]: I0311 12:24:09.514951 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:24:09 crc kubenswrapper[4816]: I0311 12:24:09.515548 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:24:39 crc kubenswrapper[4816]: I0311 12:24:39.515226 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 12:24:39 crc kubenswrapper[4816]: I0311 12:24:39.515867 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.711882 4816 scope.go:117] "RemoveContainer" containerID="0d27f73615e32aa404576eea9593c729502e37fe26b5c92717c4bee0b43a98e6"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.764904 4816 scope.go:117] "RemoveContainer" containerID="315146a94731475a01dc83fd91cffc1dc07e3b8364e3b5f9f4c74f1dffcbe0c4"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.808012 4816 scope.go:117] "RemoveContainer" containerID="0800101b998bc39fbc15280d1397d1d43d3447b5f82870b95f5c0e69e60ff601"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.850873 4816 scope.go:117] "RemoveContainer" containerID="ae9a5cdf2df1a6846c30df048ff752db89454e8f6330fe73c2c82145d550960b"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.871283 4816 scope.go:117] "RemoveContainer" containerID="19aee09219637e7a5ab326ab09421619dc187f94f0843e938baaa3e47920a542"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.897931 4816 scope.go:117] "RemoveContainer" containerID="e5f24ad51eefb627e15014c3582b64a13468c820d9ad9ccfa53acd2f0fb30054"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.927812 4816 scope.go:117] "RemoveContainer" containerID="5d5e0febf80ed4282e61f8380eff77836a90544898e5ff129bf2d82dd15449ea"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.948206 4816 scope.go:117] "RemoveContainer" containerID="5f7cdb31826f59ca1238a145635210cb534eb8b83a42327083142e83ef21c961"
Mar 11 12:24:45 crc kubenswrapper[4816]: I0311 12:24:45.984657 4816 scope.go:117] "RemoveContainer" containerID="369058cbb8fa5b6fcca641d0c6bacd8fb984decb4576950458c2fae4a2d14692"
Mar 11 12:24:46 crc kubenswrapper[4816]: I0311 12:24:46.005318 4816 scope.go:117] "RemoveContainer" containerID="234b74962788658b9515670058c8f55bb2409a552461ddec719b37310c8f7e0d"
Mar 11 12:24:46 crc kubenswrapper[4816]: I0311 12:24:46.025714 4816 scope.go:117] "RemoveContainer" containerID="619b2be9e8a3f61b134d163bc3ebb4105259f3d6eadad7ea8f76de2333bbeac4"
Mar 11 12:24:46 crc kubenswrapper[4816]: I0311 12:24:46.070559 4816 scope.go:117] "RemoveContainer" containerID="16af7949a711342a3610523f5b8fbb074d336f04c7a6eb010f9128a10368ad76"
Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.163220 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"]
Mar 11 12:24:55 crc kubenswrapper[4816]: E0311 12:24:55.164451 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="extract-utilities"
Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164468 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="extract-utilities"
Mar 11 12:24:55 crc kubenswrapper[4816]: E0311 12:24:55.164487 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="extract-content"
Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164495 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="extract-content"
Mar 11 12:24:55 crc kubenswrapper[4816]: E0311 12:24:55.164507 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c2a96e-0361-49ae-b1d2-795744511b15" containerName="oc"
Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164513 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c2a96e-0361-49ae-b1d2-795744511b15" containerName="oc"
Mar 11 12:24:55 crc kubenswrapper[4816]:
E0311 12:24:55.164526 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="registry-server" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164532 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="registry-server" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164672 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7330c4-f58c-4f88-b3bb-57a9330ff446" containerName="registry-server" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.164694 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c2a96e-0361-49ae-b1d2-795744511b15" containerName="oc" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.165807 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.184889 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.268168 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.268307 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 
12:24:55.268357 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.369915 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.370021 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.370076 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.370927 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.371241 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.394132 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") pod \"redhat-marketplace-x58xn\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.483981 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:24:55 crc kubenswrapper[4816]: I0311 12:24:55.909916 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:24:56 crc kubenswrapper[4816]: I0311 12:24:56.827642 4816 generic.go:334] "Generic (PLEG): container finished" podID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerID="6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae" exitCode=0 Mar 11 12:24:56 crc kubenswrapper[4816]: I0311 12:24:56.827865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerDied","Data":"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae"} Mar 11 12:24:56 crc kubenswrapper[4816]: I0311 12:24:56.827953 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerStarted","Data":"fc838844e6bf9c2ab8bd6f7866edb82541f296d43517f2a59b699d6ac93eff82"} Mar 11 12:24:57 
crc kubenswrapper[4816]: I0311 12:24:57.841319 4816 generic.go:334] "Generic (PLEG): container finished" podID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerID="c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126" exitCode=0 Mar 11 12:24:57 crc kubenswrapper[4816]: I0311 12:24:57.841432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerDied","Data":"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126"} Mar 11 12:24:58 crc kubenswrapper[4816]: I0311 12:24:58.852558 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerStarted","Data":"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a"} Mar 11 12:24:58 crc kubenswrapper[4816]: I0311 12:24:58.875141 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x58xn" podStartSLOduration=2.247204054 podStartE2EDuration="3.875112653s" podCreationTimestamp="2026-03-11 12:24:55 +0000 UTC" firstStartedPulling="2026-03-11 12:24:56.829916448 +0000 UTC m=+1583.421180415" lastFinishedPulling="2026-03-11 12:24:58.457825047 +0000 UTC m=+1585.049089014" observedRunningTime="2026-03-11 12:24:58.873461376 +0000 UTC m=+1585.464725343" watchObservedRunningTime="2026-03-11 12:24:58.875112653 +0000 UTC m=+1585.466376620" Mar 11 12:25:05 crc kubenswrapper[4816]: I0311 12:25:05.484591 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:05 crc kubenswrapper[4816]: I0311 12:25:05.485179 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:05 crc kubenswrapper[4816]: I0311 12:25:05.525745 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:05 crc kubenswrapper[4816]: I0311 12:25:05.964363 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:06 crc kubenswrapper[4816]: I0311 12:25:06.006139 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:25:07 crc kubenswrapper[4816]: I0311 12:25:07.937766 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x58xn" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="registry-server" containerID="cri-o://38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" gracePeriod=2 Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.315337 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.378208 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") pod \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.378282 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") pod \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.378343 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") pod 
\"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\" (UID: \"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d\") " Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.379233 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities" (OuterVolumeSpecName: "utilities") pod "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" (UID: "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.385569 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj" (OuterVolumeSpecName: "kube-api-access-6r4xj") pod "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" (UID: "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d"). InnerVolumeSpecName "kube-api-access-6r4xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.407048 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" (UID: "03d94e6d-ee31-419d-9b0b-3c5ed80aab2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.479817 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r4xj\" (UniqueName: \"kubernetes.io/projected/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-kube-api-access-6r4xj\") on node \"crc\" DevicePath \"\"" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.479853 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.479863 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.951846 4816 generic.go:334] "Generic (PLEG): container finished" podID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerID="38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" exitCode=0 Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.951945 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x58xn" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.951975 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerDied","Data":"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a"} Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.952633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x58xn" event={"ID":"03d94e6d-ee31-419d-9b0b-3c5ed80aab2d","Type":"ContainerDied","Data":"fc838844e6bf9c2ab8bd6f7866edb82541f296d43517f2a59b699d6ac93eff82"} Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.952678 4816 scope.go:117] "RemoveContainer" containerID="38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.976866 4816 scope.go:117] "RemoveContainer" containerID="c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126" Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.986953 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:25:08 crc kubenswrapper[4816]: I0311 12:25:08.993441 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x58xn"] Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.006262 4816 scope.go:117] "RemoveContainer" containerID="6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.032059 4816 scope.go:117] "RemoveContainer" containerID="38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.032720 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a\": container with ID starting with 38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a not found: ID does not exist" containerID="38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.032780 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a"} err="failed to get container status \"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a\": rpc error: code = NotFound desc = could not find container \"38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a\": container with ID starting with 38223cfa50c223cb06d45baf800f8152f2e84f146e2add38b7af689440300a5a not found: ID does not exist" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.032831 4816 scope.go:117] "RemoveContainer" containerID="c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126" Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.033237 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126\": container with ID starting with c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126 not found: ID does not exist" containerID="c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.033327 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126"} err="failed to get container status \"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126\": rpc error: code = NotFound desc = could not find container \"c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126\": container with ID 
starting with c65fe7d5a20e4b5050c03ee81679e23a451044eb73f992a53724c87240abc126 not found: ID does not exist" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.033370 4816 scope.go:117] "RemoveContainer" containerID="6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae" Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.033846 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae\": container with ID starting with 6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae not found: ID does not exist" containerID="6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.033887 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae"} err="failed to get container status \"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae\": rpc error: code = NotFound desc = could not find container \"6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae\": container with ID starting with 6d3f6f14d76a2d4fc858c260674ca7f6ab190c541311493a84e7844578098fae not found: ID does not exist" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.515316 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.515403 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.515475 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.516280 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.516357 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" gracePeriod=600 Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.639107 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.970439 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" exitCode=0 Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.970495 4816 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3"} Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.970537 4816 scope.go:117] "RemoveContainer" containerID="20e5352a1f18de3da65279dced0572d988bf4c64c45f769d6d0ae47f9c2cef9a" Mar 11 12:25:09 crc kubenswrapper[4816]: I0311 12:25:09.971299 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:25:09 crc kubenswrapper[4816]: E0311 12:25:09.971932 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:25:10 crc kubenswrapper[4816]: I0311 12:25:10.142197 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" path="/var/lib/kubelet/pods/03d94e6d-ee31-419d-9b0b-3c5ed80aab2d/volumes" Mar 11 12:25:24 crc kubenswrapper[4816]: I0311 12:25:24.135299 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:25:24 crc kubenswrapper[4816]: E0311 12:25:24.136232 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 
11 12:25:38 crc kubenswrapper[4816]: I0311 12:25:38.130648 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:25:38 crc kubenswrapper[4816]: E0311 12:25:38.131583 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.250585 4816 scope.go:117] "RemoveContainer" containerID="c04dc0a2663851eac8a9c1faccfd79cf6c27fbce470c4ad0b7499358caea8a06" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.278160 4816 scope.go:117] "RemoveContainer" containerID="6a15e8693d1f25cf8eeefb7b013bbcd57f9676d5cee6b31111e7f71f5ea2e5ca" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.309873 4816 scope.go:117] "RemoveContainer" containerID="0f5d94dc5d9bb04750c9b3d2e89fcd5a5d6e20ec2f4a19899cb047b2927291a4" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.332035 4816 scope.go:117] "RemoveContainer" containerID="691c4f9d45de04f6bb32f82d9d22154b130edce7e7b8b75479f100df834dbbad" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.364889 4816 scope.go:117] "RemoveContainer" containerID="8dc2306ac32e5d795143d562064f5d8e129c4815490ca1bada6d8509ddcc5240" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.387541 4816 scope.go:117] "RemoveContainer" containerID="fd551dbdb54bb8de807a245da392a1fc03bca9f397e581b66063faafeaf38a5f" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.406339 4816 scope.go:117] "RemoveContainer" containerID="c3956854978860cbc650270e665106bd8e95400d5b8cce00a86ed500eb262922" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.471146 4816 
scope.go:117] "RemoveContainer" containerID="5e19f1840cfd8f7623e64404579f814579ee6602ca765f964613a90342b26cc2" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.501502 4816 scope.go:117] "RemoveContainer" containerID="c704df83b8c052d797cc33017726ade79e749840ea39268bdd3404b42194d40d" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.540402 4816 scope.go:117] "RemoveContainer" containerID="1f2178fe24813df8bfcc542c32d18ec7c0d7ab550dc406623e692f0465cd6535" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.574141 4816 scope.go:117] "RemoveContainer" containerID="090174f400ae3d182bc1e17d475eb20c26198249c703a798e2b253812bea946b" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.597057 4816 scope.go:117] "RemoveContainer" containerID="2ba25af6bbe93bf77e8ed2bed1866df9a0d1cdcadbd32ffc70070db8155b1914" Mar 11 12:25:46 crc kubenswrapper[4816]: I0311 12:25:46.620076 4816 scope.go:117] "RemoveContainer" containerID="cbfbf586e19291c8ee373bf860029353b3f56429b7bf9015d736b9982aa4797f" Mar 11 12:25:53 crc kubenswrapper[4816]: I0311 12:25:53.130822 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:25:53 crc kubenswrapper[4816]: E0311 12:25:53.132522 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.165344 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:26:00 crc kubenswrapper[4816]: E0311 12:26:00.166419 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="extract-utilities" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.166437 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="extract-utilities" Mar 11 12:26:00 crc kubenswrapper[4816]: E0311 12:26:00.166460 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="extract-content" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.166468 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="extract-content" Mar 11 12:26:00 crc kubenswrapper[4816]: E0311 12:26:00.166489 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="registry-server" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.166497 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="registry-server" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.166675 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d94e6d-ee31-419d-9b0b-3c5ed80aab2d" containerName="registry-server" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.167367 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.170691 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.170784 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.170980 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.176448 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.288840 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") pod \"auto-csr-approver-29553866-w7rtm\" (UID: \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\") " pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.390628 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") pod \"auto-csr-approver-29553866-w7rtm\" (UID: \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\") " pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.411138 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") pod \"auto-csr-approver-29553866-w7rtm\" (UID: \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\") " 
pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.504223 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:00 crc kubenswrapper[4816]: I0311 12:26:00.953158 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:26:01 crc kubenswrapper[4816]: I0311 12:26:01.446358 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" event={"ID":"94280e97-4b7e-4a2d-8f1f-c3125f5910bc","Type":"ContainerStarted","Data":"5f08cc228d5680e5632a5bb4aa691b63f252341df6512e0b7902502d897b8a17"} Mar 11 12:26:02 crc kubenswrapper[4816]: I0311 12:26:02.457780 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" event={"ID":"94280e97-4b7e-4a2d-8f1f-c3125f5910bc","Type":"ContainerStarted","Data":"b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043"} Mar 11 12:26:02 crc kubenswrapper[4816]: I0311 12:26:02.483625 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" podStartSLOduration=1.512860462 podStartE2EDuration="2.483600364s" podCreationTimestamp="2026-03-11 12:26:00 +0000 UTC" firstStartedPulling="2026-03-11 12:26:00.963978039 +0000 UTC m=+1647.555242016" lastFinishedPulling="2026-03-11 12:26:01.934717961 +0000 UTC m=+1648.525981918" observedRunningTime="2026-03-11 12:26:02.472572048 +0000 UTC m=+1649.063836025" watchObservedRunningTime="2026-03-11 12:26:02.483600364 +0000 UTC m=+1649.074864321" Mar 11 12:26:02 crc kubenswrapper[4816]: E0311 12:26:02.520492 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94280e97_4b7e_4a2d_8f1f_c3125f5910bc.slice/crio-b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:26:03 crc kubenswrapper[4816]: I0311 12:26:03.468917 4816 generic.go:334] "Generic (PLEG): container finished" podID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" containerID="b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043" exitCode=0 Mar 11 12:26:03 crc kubenswrapper[4816]: I0311 12:26:03.468982 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" event={"ID":"94280e97-4b7e-4a2d-8f1f-c3125f5910bc","Type":"ContainerDied","Data":"b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043"} Mar 11 12:26:04 crc kubenswrapper[4816]: I0311 12:26:04.773023 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:04 crc kubenswrapper[4816]: I0311 12:26:04.967892 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") pod \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\" (UID: \"94280e97-4b7e-4a2d-8f1f-c3125f5910bc\") " Mar 11 12:26:04 crc kubenswrapper[4816]: I0311 12:26:04.979581 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg" (OuterVolumeSpecName: "kube-api-access-wdkfg") pod "94280e97-4b7e-4a2d-8f1f-c3125f5910bc" (UID: "94280e97-4b7e-4a2d-8f1f-c3125f5910bc"). InnerVolumeSpecName "kube-api-access-wdkfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.070332 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdkfg\" (UniqueName: \"kubernetes.io/projected/94280e97-4b7e-4a2d-8f1f-c3125f5910bc-kube-api-access-wdkfg\") on node \"crc\" DevicePath \"\"" Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.486410 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" event={"ID":"94280e97-4b7e-4a2d-8f1f-c3125f5910bc","Type":"ContainerDied","Data":"5f08cc228d5680e5632a5bb4aa691b63f252341df6512e0b7902502d897b8a17"} Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.486470 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f08cc228d5680e5632a5bb4aa691b63f252341df6512e0b7902502d897b8a17" Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.486475 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553866-w7rtm" Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.540763 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:26:05 crc kubenswrapper[4816]: I0311 12:26:05.545987 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553860-9kp4n"] Mar 11 12:26:06 crc kubenswrapper[4816]: I0311 12:26:06.139867 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9" path="/var/lib/kubelet/pods/4cde56f2-9503-4b1c-84bf-8b49f6a9f5e9/volumes" Mar 11 12:26:08 crc kubenswrapper[4816]: I0311 12:26:08.132782 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:26:08 crc kubenswrapper[4816]: E0311 12:26:08.133329 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:26:20 crc kubenswrapper[4816]: I0311 12:26:20.131819 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:26:20 crc kubenswrapper[4816]: E0311 12:26:20.133166 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:26:34 crc kubenswrapper[4816]: I0311 12:26:34.137318 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:26:34 crc kubenswrapper[4816]: E0311 12:26:34.138334 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.802061 4816 scope.go:117] "RemoveContainer" containerID="f424c56cb69a088a064cc5d2e599b7db758b5e10ecf876c1586a8816a4d96acd" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.868096 4816 scope.go:117] "RemoveContainer" 
containerID="adc85e912176222f128333dea158980c88ef84553f1cf56cb52f64a7b64c83d6" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.890131 4816 scope.go:117] "RemoveContainer" containerID="73c8dd7cd36356a8399521ca85923fa5bea70d1a67253cbf2ad9c716aae771dd" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.911495 4816 scope.go:117] "RemoveContainer" containerID="7499f2a9acd657b210b2f77e2cefe97fa749ba96e868296d76960eaa9ed38ee8" Mar 11 12:26:46 crc kubenswrapper[4816]: I0311 12:26:46.964422 4816 scope.go:117] "RemoveContainer" containerID="7074db26ba14c2f5793b32a499e15ff64a76fc4764f04041e3b7367e813d1eb6" Mar 11 12:26:47 crc kubenswrapper[4816]: I0311 12:26:47.013509 4816 scope.go:117] "RemoveContainer" containerID="393812c2bcecafdfa2f0e8dd848197ee6392877cb2e73b5a2b5b12d642a2ed5c" Mar 11 12:26:47 crc kubenswrapper[4816]: I0311 12:26:47.052815 4816 scope.go:117] "RemoveContainer" containerID="c5bd6af011971fbc8f1597775b66953c97da9697d9d038a8ec30a1201d7a28f6" Mar 11 12:26:49 crc kubenswrapper[4816]: I0311 12:26:49.131830 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:26:49 crc kubenswrapper[4816]: E0311 12:26:49.132137 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:00 crc kubenswrapper[4816]: I0311 12:27:00.130232 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:00 crc kubenswrapper[4816]: E0311 12:27:00.131112 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:13 crc kubenswrapper[4816]: I0311 12:27:13.130365 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:13 crc kubenswrapper[4816]: E0311 12:27:13.131284 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:24 crc kubenswrapper[4816]: I0311 12:27:24.138842 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:24 crc kubenswrapper[4816]: E0311 12:27:24.141606 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:36 crc kubenswrapper[4816]: I0311 12:27:36.130988 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:36 crc kubenswrapper[4816]: E0311 12:27:36.131745 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:27:47 crc kubenswrapper[4816]: I0311 12:27:47.146354 4816 scope.go:117] "RemoveContainer" containerID="f7560f8d6f98f14204afbbce69a7ff86d5f07a2d1a84e68d20701b7c7e5ce84d" Mar 11 12:27:47 crc kubenswrapper[4816]: I0311 12:27:47.171709 4816 scope.go:117] "RemoveContainer" containerID="9fc317ca9311d71a32a61a06236255eddc3a32782036027513c2583e902eb2de" Mar 11 12:27:47 crc kubenswrapper[4816]: I0311 12:27:47.190976 4816 scope.go:117] "RemoveContainer" containerID="adf484d20700d25957189d351eb669acaae4683a20326267761afe30c6a7e50c" Mar 11 12:27:49 crc kubenswrapper[4816]: I0311 12:27:49.131172 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:27:49 crc kubenswrapper[4816]: E0311 12:27:49.131813 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.154665 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:28:00 crc kubenswrapper[4816]: E0311 12:28:00.162775 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" containerName="oc" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.162805 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" containerName="oc" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.162959 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" containerName="oc" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.163529 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.164277 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.165833 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.166043 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.166043 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.275852 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") pod \"auto-csr-approver-29553868-qmhvt\" (UID: \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\") " pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.377574 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") pod \"auto-csr-approver-29553868-qmhvt\" (UID: \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\") " 
pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.403480 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") pod \"auto-csr-approver-29553868-qmhvt\" (UID: \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\") " pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.489850 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.914637 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:28:00 crc kubenswrapper[4816]: I0311 12:28:00.924226 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:28:01 crc kubenswrapper[4816]: I0311 12:28:01.131126 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:01 crc kubenswrapper[4816]: E0311 12:28:01.131482 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:01 crc kubenswrapper[4816]: I0311 12:28:01.449311 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" event={"ID":"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf","Type":"ContainerStarted","Data":"10969e02780e70dd2f19efe9e840c72d7513c2438e871aab3b04515a76a41e86"} Mar 11 
12:28:03 crc kubenswrapper[4816]: I0311 12:28:03.469190 4816 generic.go:334] "Generic (PLEG): container finished" podID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" containerID="6f896b214f33da369f143727ecfdb3b64f134749ec6e50337a2f62ef03d15c62" exitCode=0 Mar 11 12:28:03 crc kubenswrapper[4816]: I0311 12:28:03.469280 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" event={"ID":"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf","Type":"ContainerDied","Data":"6f896b214f33da369f143727ecfdb3b64f134749ec6e50337a2f62ef03d15c62"} Mar 11 12:28:04 crc kubenswrapper[4816]: I0311 12:28:04.839215 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:04 crc kubenswrapper[4816]: I0311 12:28:04.950784 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") pod \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\" (UID: \"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf\") " Mar 11 12:28:04 crc kubenswrapper[4816]: I0311 12:28:04.964478 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp" (OuterVolumeSpecName: "kube-api-access-rqwrp") pod "7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" (UID: "7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf"). InnerVolumeSpecName "kube-api-access-rqwrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.052802 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqwrp\" (UniqueName: \"kubernetes.io/projected/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf-kube-api-access-rqwrp\") on node \"crc\" DevicePath \"\"" Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.489496 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" event={"ID":"7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf","Type":"ContainerDied","Data":"10969e02780e70dd2f19efe9e840c72d7513c2438e871aab3b04515a76a41e86"} Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.489575 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10969e02780e70dd2f19efe9e840c72d7513c2438e871aab3b04515a76a41e86" Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.489674 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553868-qmhvt" Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.916335 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"] Mar 11 12:28:05 crc kubenswrapper[4816]: I0311 12:28:05.921509 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553862-ldg69"] Mar 11 12:28:06 crc kubenswrapper[4816]: I0311 12:28:06.149999 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="787da494-4b4f-4a96-9e39-45179c456dc0" path="/var/lib/kubelet/pods/787da494-4b4f-4a96-9e39-45179c456dc0/volumes" Mar 11 12:28:12 crc kubenswrapper[4816]: I0311 12:28:12.131147 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:12 crc kubenswrapper[4816]: E0311 12:28:12.132159 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:26 crc kubenswrapper[4816]: I0311 12:28:26.130370 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:26 crc kubenswrapper[4816]: E0311 12:28:26.131639 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:38 crc kubenswrapper[4816]: I0311 12:28:38.131509 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:38 crc kubenswrapper[4816]: E0311 12:28:38.132537 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:28:47 crc kubenswrapper[4816]: I0311 12:28:47.272902 4816 scope.go:117] "RemoveContainer" containerID="d0874559d26089e67dcd3126789f0cf0dc3ed1323323af96fe7e8ee67fbd532f" Mar 11 12:28:49 crc kubenswrapper[4816]: I0311 12:28:49.133234 4816 scope.go:117] "RemoveContainer" 
containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:28:49 crc kubenswrapper[4816]: E0311 12:28:49.134784 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:01 crc kubenswrapper[4816]: I0311 12:29:01.129939 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:01 crc kubenswrapper[4816]: E0311 12:29:01.130785 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:13 crc kubenswrapper[4816]: I0311 12:29:13.130145 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:13 crc kubenswrapper[4816]: E0311 12:29:13.131985 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:25 crc kubenswrapper[4816]: I0311 12:29:25.130678 4816 scope.go:117] 
"RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:25 crc kubenswrapper[4816]: E0311 12:29:25.132162 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:39 crc kubenswrapper[4816]: I0311 12:29:39.131886 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:39 crc kubenswrapper[4816]: E0311 12:29:39.132728 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:29:50 crc kubenswrapper[4816]: I0311 12:29:50.131341 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:29:50 crc kubenswrapper[4816]: E0311 12:29:50.132291 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.163716 
4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:30:00 crc kubenswrapper[4816]: E0311 12:30:00.164832 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" containerName="oc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.164855 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" containerName="oc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.165097 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" containerName="oc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.165959 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.169336 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.170122 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.170417 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.170707 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.179156 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.180411 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.190046 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.190063 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.194574 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.330715 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.330767 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") pod \"auto-csr-approver-29553870-5ndrc\" (UID: \"2920f68a-c5bb-474c-929b-09ced109bcc0\") " pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.331053 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.331324 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.432515 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.432935 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.432965 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") pod \"auto-csr-approver-29553870-5ndrc\" (UID: \"2920f68a-c5bb-474c-929b-09ced109bcc0\") " pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.433094 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.434083 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.440131 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.449360 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") pod \"collect-profiles-29553870-n6l8j\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.451115 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") pod \"auto-csr-approver-29553870-5ndrc\" (UID: \"2920f68a-c5bb-474c-929b-09ced109bcc0\") " pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.499029 4816 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.515581 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.921769 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 12:30:00 crc kubenswrapper[4816]: I0311 12:30:00.967799 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:30:00 crc kubenswrapper[4816]: W0311 12:30:00.968169 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2920f68a_c5bb_474c_929b_09ced109bcc0.slice/crio-507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2 WatchSource:0}: Error finding container 507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2: Status 404 returned error can't find the container with id 507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2 Mar 11 12:30:01 crc kubenswrapper[4816]: I0311 12:30:01.533666 4816 generic.go:334] "Generic (PLEG): container finished" podID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" containerID="df1d35d17e400d5b7e626c6af7307f8e5a96cbf6b1e197b1b3bcbb3209f59864" exitCode=0 Mar 11 12:30:01 crc kubenswrapper[4816]: I0311 12:30:01.533765 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" event={"ID":"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81","Type":"ContainerDied","Data":"df1d35d17e400d5b7e626c6af7307f8e5a96cbf6b1e197b1b3bcbb3209f59864"} Mar 11 12:30:01 crc kubenswrapper[4816]: I0311 12:30:01.533803 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" event={"ID":"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81","Type":"ContainerStarted","Data":"0ed1cd926ec9b86826d9073d68146148020654ae60ad790cb1be73478f1918d9"} Mar 11 12:30:01 crc kubenswrapper[4816]: I0311 12:30:01.535559 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" event={"ID":"2920f68a-c5bb-474c-929b-09ced109bcc0","Type":"ContainerStarted","Data":"507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2"} Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.819378 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.873131 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") pod \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.873225 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") pod \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.873341 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") pod \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\" (UID: \"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81\") " Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.874115 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume" (OuterVolumeSpecName: "config-volume") pod "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" (UID: "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.881054 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" (UID: "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.881087 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5" (OuterVolumeSpecName: "kube-api-access-j4zb5") pod "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" (UID: "af7ed2ea-fc1c-4a1a-bf16-50a9817aac81"). InnerVolumeSpecName "kube-api-access-j4zb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.974632 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.974896 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4zb5\" (UniqueName: \"kubernetes.io/projected/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-kube-api-access-j4zb5\") on node \"crc\" DevicePath \"\"" Mar 11 12:30:02 crc kubenswrapper[4816]: I0311 12:30:02.974957 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.551853 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" event={"ID":"af7ed2ea-fc1c-4a1a-bf16-50a9817aac81","Type":"ContainerDied","Data":"0ed1cd926ec9b86826d9073d68146148020654ae60ad790cb1be73478f1918d9"} Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.552624 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed1cd926ec9b86826d9073d68146148020654ae60ad790cb1be73478f1918d9" Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.551894 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j" Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.553629 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" event={"ID":"2920f68a-c5bb-474c-929b-09ced109bcc0","Type":"ContainerStarted","Data":"921a40cedc2e53aafc505227f80b7caf98ac145dd8c9d234ce2c285d7eb65e19"} Mar 11 12:30:03 crc kubenswrapper[4816]: I0311 12:30:03.863949 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" podStartSLOduration=1.557733046 podStartE2EDuration="3.86390744s" podCreationTimestamp="2026-03-11 12:30:00 +0000 UTC" firstStartedPulling="2026-03-11 12:30:00.970944777 +0000 UTC m=+1887.562208744" lastFinishedPulling="2026-03-11 12:30:03.277119171 +0000 UTC m=+1889.868383138" observedRunningTime="2026-03-11 12:30:03.573684231 +0000 UTC m=+1890.164948218" watchObservedRunningTime="2026-03-11 12:30:03.86390744 +0000 UTC m=+1890.455171407" Mar 11 12:30:04 crc kubenswrapper[4816]: I0311 12:30:04.563156 4816 generic.go:334] "Generic (PLEG): container finished" podID="2920f68a-c5bb-474c-929b-09ced109bcc0" containerID="921a40cedc2e53aafc505227f80b7caf98ac145dd8c9d234ce2c285d7eb65e19" exitCode=0 Mar 11 12:30:04 crc kubenswrapper[4816]: I0311 12:30:04.563224 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" event={"ID":"2920f68a-c5bb-474c-929b-09ced109bcc0","Type":"ContainerDied","Data":"921a40cedc2e53aafc505227f80b7caf98ac145dd8c9d234ce2c285d7eb65e19"} Mar 11 12:30:05 crc kubenswrapper[4816]: I0311 12:30:05.131092 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:30:05 crc kubenswrapper[4816]: E0311 12:30:05.131819 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:30:05 crc kubenswrapper[4816]: I0311 12:30:05.828515 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.021701 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") pod \"2920f68a-c5bb-474c-929b-09ced109bcc0\" (UID: \"2920f68a-c5bb-474c-929b-09ced109bcc0\") " Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.028199 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv" (OuterVolumeSpecName: "kube-api-access-xxwjv") pod "2920f68a-c5bb-474c-929b-09ced109bcc0" (UID: "2920f68a-c5bb-474c-929b-09ced109bcc0"). InnerVolumeSpecName "kube-api-access-xxwjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.124327 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxwjv\" (UniqueName: \"kubernetes.io/projected/2920f68a-c5bb-474c-929b-09ced109bcc0-kube-api-access-xxwjv\") on node \"crc\" DevicePath \"\"" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.581639 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" event={"ID":"2920f68a-c5bb-474c-929b-09ced109bcc0","Type":"ContainerDied","Data":"507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2"} Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.581774 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507b3aeab6926360f8ab859be408d524db892bb97d8bacd4ffcbc7166864a7b2" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.581727 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553870-5ndrc" Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.629678 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"] Mar 11 12:30:06 crc kubenswrapper[4816]: I0311 12:30:06.634905 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553864-r52vk"] Mar 11 12:30:08 crc kubenswrapper[4816]: I0311 12:30:08.151539 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c2a96e-0361-49ae-b1d2-795744511b15" path="/var/lib/kubelet/pods/b1c2a96e-0361-49ae-b1d2-795744511b15/volumes" Mar 11 12:30:18 crc kubenswrapper[4816]: I0311 12:30:18.131232 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:30:18 crc kubenswrapper[4816]: I0311 12:30:18.682071 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9"} Mar 11 12:30:47 crc kubenswrapper[4816]: I0311 12:30:47.349452 4816 scope.go:117] "RemoveContainer" containerID="abd9d62fb0ff8a700d3029ac698637da8b39d38073d7e8f33a437d1d746d66d8" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.147482 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:32:00 crc kubenswrapper[4816]: E0311 12:32:00.148529 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2920f68a-c5bb-474c-929b-09ced109bcc0" containerName="oc" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.148549 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2920f68a-c5bb-474c-929b-09ced109bcc0" containerName="oc" Mar 11 12:32:00 crc kubenswrapper[4816]: E0311 12:32:00.148567 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" containerName="collect-profiles" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.148575 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" containerName="collect-profiles" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.148745 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" containerName="collect-profiles" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.148763 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2920f68a-c5bb-474c-929b-09ced109bcc0" containerName="oc" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.149520 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.152106 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.152385 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.152608 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.158163 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.235007 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") pod \"auto-csr-approver-29553872-r7v8g\" (UID: \"9ea5145c-0d08-4c85-984a-84c7e0820999\") " pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.336354 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") pod \"auto-csr-approver-29553872-r7v8g\" (UID: \"9ea5145c-0d08-4c85-984a-84c7e0820999\") " pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.358158 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") pod \"auto-csr-approver-29553872-r7v8g\" (UID: \"9ea5145c-0d08-4c85-984a-84c7e0820999\") " 
pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:00 crc kubenswrapper[4816]: I0311 12:32:00.468317 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:01 crc kubenswrapper[4816]: I0311 12:32:01.015106 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:32:01 crc kubenswrapper[4816]: I0311 12:32:01.580450 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" event={"ID":"9ea5145c-0d08-4c85-984a-84c7e0820999","Type":"ContainerStarted","Data":"132c71556380e7ecbf9e9d5efbb6e1f1c231855b72c0b3699d9d7dc11b983ca1"} Mar 11 12:32:02 crc kubenswrapper[4816]: I0311 12:32:02.588726 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" event={"ID":"9ea5145c-0d08-4c85-984a-84c7e0820999","Type":"ContainerStarted","Data":"2595726d36e0a1d13282e231fa75cc95c6cd459575385b61b5632d40de6eac9f"} Mar 11 12:32:02 crc kubenswrapper[4816]: I0311 12:32:02.608807 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" podStartSLOduration=1.357092295 podStartE2EDuration="2.60878226s" podCreationTimestamp="2026-03-11 12:32:00 +0000 UTC" firstStartedPulling="2026-03-11 12:32:01.027187349 +0000 UTC m=+2007.618451316" lastFinishedPulling="2026-03-11 12:32:02.278877304 +0000 UTC m=+2008.870141281" observedRunningTime="2026-03-11 12:32:02.601338732 +0000 UTC m=+2009.192602719" watchObservedRunningTime="2026-03-11 12:32:02.60878226 +0000 UTC m=+2009.200046227" Mar 11 12:32:03 crc kubenswrapper[4816]: I0311 12:32:03.602999 4816 generic.go:334] "Generic (PLEG): container finished" podID="9ea5145c-0d08-4c85-984a-84c7e0820999" containerID="2595726d36e0a1d13282e231fa75cc95c6cd459575385b61b5632d40de6eac9f" exitCode=0 Mar 11 12:32:03 crc 
kubenswrapper[4816]: I0311 12:32:03.603082 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" event={"ID":"9ea5145c-0d08-4c85-984a-84c7e0820999","Type":"ContainerDied","Data":"2595726d36e0a1d13282e231fa75cc95c6cd459575385b61b5632d40de6eac9f"} Mar 11 12:32:04 crc kubenswrapper[4816]: I0311 12:32:04.925809 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.109381 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") pod \"9ea5145c-0d08-4c85-984a-84c7e0820999\" (UID: \"9ea5145c-0d08-4c85-984a-84c7e0820999\") " Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.118385 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg" (OuterVolumeSpecName: "kube-api-access-mtkgg") pod "9ea5145c-0d08-4c85-984a-84c7e0820999" (UID: "9ea5145c-0d08-4c85-984a-84c7e0820999"). InnerVolumeSpecName "kube-api-access-mtkgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.211315 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtkgg\" (UniqueName: \"kubernetes.io/projected/9ea5145c-0d08-4c85-984a-84c7e0820999-kube-api-access-mtkgg\") on node \"crc\" DevicePath \"\"" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.621726 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" event={"ID":"9ea5145c-0d08-4c85-984a-84c7e0820999","Type":"ContainerDied","Data":"132c71556380e7ecbf9e9d5efbb6e1f1c231855b72c0b3699d9d7dc11b983ca1"} Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.621778 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553872-r7v8g" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.621790 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132c71556380e7ecbf9e9d5efbb6e1f1c231855b72c0b3699d9d7dc11b983ca1" Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.686696 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:32:05 crc kubenswrapper[4816]: I0311 12:32:05.691617 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553866-w7rtm"] Mar 11 12:32:06 crc kubenswrapper[4816]: I0311 12:32:06.141140 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94280e97-4b7e-4a2d-8f1f-c3125f5910bc" path="/var/lib/kubelet/pods/94280e97-4b7e-4a2d-8f1f-c3125f5910bc/volumes" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.726736 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:37 crc kubenswrapper[4816]: E0311 12:32:37.728285 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ea5145c-0d08-4c85-984a-84c7e0820999" containerName="oc" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.728310 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea5145c-0d08-4c85-984a-84c7e0820999" containerName="oc" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.728555 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea5145c-0d08-4c85-984a-84c7e0820999" containerName="oc" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.730108 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.748323 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.903398 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.903492 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:37 crc kubenswrapper[4816]: I0311 12:32:37.903531 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " 
pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.005682 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.005996 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.006050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.006460 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.006600 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc 
kubenswrapper[4816]: I0311 12:32:38.040340 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") pod \"redhat-operators-mhflg\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.058332 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.651783 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.891393 4816 generic.go:334] "Generic (PLEG): container finished" podID="589c0cba-36d0-4224-ab81-dfcef6906331" containerID="3aa7c0e071b2d12f65dd8afa57fdda72f32fe235d19f86d4821e81596b958166" exitCode=0 Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.891447 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerDied","Data":"3aa7c0e071b2d12f65dd8afa57fdda72f32fe235d19f86d4821e81596b958166"} Mar 11 12:32:38 crc kubenswrapper[4816]: I0311 12:32:38.891479 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerStarted","Data":"7600999d30e2047fb4c28c092d759410853359befac2bdcbf26a0353c15bbbaf"} Mar 11 12:32:39 crc kubenswrapper[4816]: I0311 12:32:39.514706 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:32:39 
crc kubenswrapper[4816]: I0311 12:32:39.515072 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:32:39 crc kubenswrapper[4816]: I0311 12:32:39.902432 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerStarted","Data":"d7bbfc360686ecb08719d6c8765ab786422bed05f456eaff1e04e0e820cb0cad"} Mar 11 12:32:40 crc kubenswrapper[4816]: I0311 12:32:40.910295 4816 generic.go:334] "Generic (PLEG): container finished" podID="589c0cba-36d0-4224-ab81-dfcef6906331" containerID="d7bbfc360686ecb08719d6c8765ab786422bed05f456eaff1e04e0e820cb0cad" exitCode=0 Mar 11 12:32:40 crc kubenswrapper[4816]: I0311 12:32:40.910433 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerDied","Data":"d7bbfc360686ecb08719d6c8765ab786422bed05f456eaff1e04e0e820cb0cad"} Mar 11 12:32:41 crc kubenswrapper[4816]: I0311 12:32:41.921178 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerStarted","Data":"41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b"} Mar 11 12:32:41 crc kubenswrapper[4816]: I0311 12:32:41.948481 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mhflg" podStartSLOduration=2.472274317 podStartE2EDuration="4.948451777s" podCreationTimestamp="2026-03-11 12:32:37 +0000 UTC" firstStartedPulling="2026-03-11 12:32:38.893232812 +0000 UTC m=+2045.484496779" 
lastFinishedPulling="2026-03-11 12:32:41.369410252 +0000 UTC m=+2047.960674239" observedRunningTime="2026-03-11 12:32:41.938674964 +0000 UTC m=+2048.529938941" watchObservedRunningTime="2026-03-11 12:32:41.948451777 +0000 UTC m=+2048.539715754" Mar 11 12:32:47 crc kubenswrapper[4816]: I0311 12:32:47.425820 4816 scope.go:117] "RemoveContainer" containerID="b6abd0eb57a8c7686ba5edd206e023c171a5ca0fee490573bab4f684459e8043" Mar 11 12:32:48 crc kubenswrapper[4816]: I0311 12:32:48.058769 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:48 crc kubenswrapper[4816]: I0311 12:32:48.058827 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:48 crc kubenswrapper[4816]: I0311 12:32:48.098317 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:49 crc kubenswrapper[4816]: I0311 12:32:49.043997 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:49 crc kubenswrapper[4816]: I0311 12:32:49.107500 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:51 crc kubenswrapper[4816]: I0311 12:32:51.016435 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mhflg" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="registry-server" containerID="cri-o://41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b" gracePeriod=2 Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.027076 4816 generic.go:334] "Generic (PLEG): container finished" podID="589c0cba-36d0-4224-ab81-dfcef6906331" containerID="41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b" exitCode=0 Mar 11 12:32:52 crc 
kubenswrapper[4816]: I0311 12:32:52.027149 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerDied","Data":"41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b"} Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.510654 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.654203 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") pod \"589c0cba-36d0-4224-ab81-dfcef6906331\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.655054 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") pod \"589c0cba-36d0-4224-ab81-dfcef6906331\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.655168 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") pod \"589c0cba-36d0-4224-ab81-dfcef6906331\" (UID: \"589c0cba-36d0-4224-ab81-dfcef6906331\") " Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.656414 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities" (OuterVolumeSpecName: "utilities") pod "589c0cba-36d0-4224-ab81-dfcef6906331" (UID: "589c0cba-36d0-4224-ab81-dfcef6906331"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.661320 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95" (OuterVolumeSpecName: "kube-api-access-gwj95") pod "589c0cba-36d0-4224-ab81-dfcef6906331" (UID: "589c0cba-36d0-4224-ab81-dfcef6906331"). InnerVolumeSpecName "kube-api-access-gwj95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.756738 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.756788 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwj95\" (UniqueName: \"kubernetes.io/projected/589c0cba-36d0-4224-ab81-dfcef6906331-kube-api-access-gwj95\") on node \"crc\" DevicePath \"\"" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.821106 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "589c0cba-36d0-4224-ab81-dfcef6906331" (UID: "589c0cba-36d0-4224-ab81-dfcef6906331"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:32:52 crc kubenswrapper[4816]: I0311 12:32:52.857876 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c0cba-36d0-4224-ab81-dfcef6906331-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.040186 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhflg" event={"ID":"589c0cba-36d0-4224-ab81-dfcef6906331","Type":"ContainerDied","Data":"7600999d30e2047fb4c28c092d759410853359befac2bdcbf26a0353c15bbbaf"} Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.040223 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhflg" Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.040279 4816 scope.go:117] "RemoveContainer" containerID="41b9a433ab68895783b085e7821f621ee0e83d4fb07214cce70a87e38938a02b" Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.062707 4816 scope.go:117] "RemoveContainer" containerID="d7bbfc360686ecb08719d6c8765ab786422bed05f456eaff1e04e0e820cb0cad" Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.085122 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.095823 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mhflg"] Mar 11 12:32:53 crc kubenswrapper[4816]: I0311 12:32:53.100360 4816 scope.go:117] "RemoveContainer" containerID="3aa7c0e071b2d12f65dd8afa57fdda72f32fe235d19f86d4821e81596b958166" Mar 11 12:32:54 crc kubenswrapper[4816]: I0311 12:32:54.143183 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" path="/var/lib/kubelet/pods/589c0cba-36d0-4224-ab81-dfcef6906331/volumes" Mar 11 12:33:09 crc 
kubenswrapper[4816]: I0311 12:33:09.515502 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:33:09 crc kubenswrapper[4816]: I0311 12:33:09.516039 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.967304 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:13 crc kubenswrapper[4816]: E0311 12:33:13.968821 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="extract-utilities" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.968862 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="extract-utilities" Mar 11 12:33:13 crc kubenswrapper[4816]: E0311 12:33:13.968913 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="registry-server" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.968922 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="registry-server" Mar 11 12:33:13 crc kubenswrapper[4816]: E0311 12:33:13.968946 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="extract-content" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.968954 4816 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="extract-content" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.969274 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="589c0cba-36d0-4224-ab81-dfcef6906331" containerName="registry-server" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.970672 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:13 crc kubenswrapper[4816]: I0311 12:33:13.984801 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.096386 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.096572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.096770 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198016 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198143 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198166 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198740 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.198832 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.218895 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") pod \"certified-operators-r2pkt\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.303571 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:14 crc kubenswrapper[4816]: I0311 12:33:14.599699 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:15 crc kubenswrapper[4816]: I0311 12:33:15.258016 4816 generic.go:334] "Generic (PLEG): container finished" podID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerID="8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44" exitCode=0 Mar 11 12:33:15 crc kubenswrapper[4816]: I0311 12:33:15.258130 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerDied","Data":"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44"} Mar 11 12:33:15 crc kubenswrapper[4816]: I0311 12:33:15.258439 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerStarted","Data":"0033fa1237cf2aca43feef20e61c599f42d60351aa258570a9bcddf0b7f1affe"} Mar 11 12:33:15 crc kubenswrapper[4816]: I0311 12:33:15.261198 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:33:16 crc kubenswrapper[4816]: I0311 12:33:16.267608 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" 
event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerStarted","Data":"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b"} Mar 11 12:33:17 crc kubenswrapper[4816]: I0311 12:33:17.286384 4816 generic.go:334] "Generic (PLEG): container finished" podID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerID="c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b" exitCode=0 Mar 11 12:33:17 crc kubenswrapper[4816]: I0311 12:33:17.286463 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerDied","Data":"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b"} Mar 11 12:33:18 crc kubenswrapper[4816]: I0311 12:33:18.297922 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerStarted","Data":"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8"} Mar 11 12:33:18 crc kubenswrapper[4816]: I0311 12:33:18.321665 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r2pkt" podStartSLOduration=2.822032192 podStartE2EDuration="5.321640546s" podCreationTimestamp="2026-03-11 12:33:13 +0000 UTC" firstStartedPulling="2026-03-11 12:33:15.260772013 +0000 UTC m=+2081.852035990" lastFinishedPulling="2026-03-11 12:33:17.760380377 +0000 UTC m=+2084.351644344" observedRunningTime="2026-03-11 12:33:18.316363039 +0000 UTC m=+2084.907627016" watchObservedRunningTime="2026-03-11 12:33:18.321640546 +0000 UTC m=+2084.912904513" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.304166 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.304721 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.369218 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.414451 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:24 crc kubenswrapper[4816]: I0311 12:33:24.607604 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.356644 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r2pkt" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="registry-server" containerID="cri-o://7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" gracePeriod=2 Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.731932 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.801324 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") pod \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.801468 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") pod \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.801550 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") pod \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\" (UID: \"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1\") " Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.802405 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities" (OuterVolumeSpecName: "utilities") pod "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" (UID: "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.809541 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w" (OuterVolumeSpecName: "kube-api-access-gkk4w") pod "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" (UID: "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1"). InnerVolumeSpecName "kube-api-access-gkk4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.863148 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" (UID: "fabbe121-d60e-4b29-8f0b-cd0e8fce41c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.903502 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.903549 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkk4w\" (UniqueName: \"kubernetes.io/projected/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-kube-api-access-gkk4w\") on node \"crc\" DevicePath \"\"" Mar 11 12:33:26 crc kubenswrapper[4816]: I0311 12:33:26.903570 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.365959 4816 generic.go:334] "Generic (PLEG): container finished" podID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerID="7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" exitCode=0 Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.366010 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerDied","Data":"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8"} Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.366044 4816 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-r2pkt" event={"ID":"fabbe121-d60e-4b29-8f0b-cd0e8fce41c1","Type":"ContainerDied","Data":"0033fa1237cf2aca43feef20e61c599f42d60351aa258570a9bcddf0b7f1affe"} Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.366045 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2pkt" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.366082 4816 scope.go:117] "RemoveContainer" containerID="7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.384962 4816 scope.go:117] "RemoveContainer" containerID="c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.400761 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.405514 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r2pkt"] Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.417964 4816 scope.go:117] "RemoveContainer" containerID="8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.440995 4816 scope.go:117] "RemoveContainer" containerID="7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" Mar 11 12:33:27 crc kubenswrapper[4816]: E0311 12:33:27.441623 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8\": container with ID starting with 7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8 not found: ID does not exist" containerID="7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 
12:33:27.441696 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8"} err="failed to get container status \"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8\": rpc error: code = NotFound desc = could not find container \"7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8\": container with ID starting with 7eb2efe676c910f918a62ef446a9d6d14650804e2752ffade96c8e611a5da2c8 not found: ID does not exist" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.441736 4816 scope.go:117] "RemoveContainer" containerID="c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b" Mar 11 12:33:27 crc kubenswrapper[4816]: E0311 12:33:27.442166 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b\": container with ID starting with c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b not found: ID does not exist" containerID="c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.442300 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b"} err="failed to get container status \"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b\": rpc error: code = NotFound desc = could not find container \"c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b\": container with ID starting with c69b0b30da36e251f8021e2ef6670c286315f8e84ceed75bdb53f2f60d68004b not found: ID does not exist" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.442392 4816 scope.go:117] "RemoveContainer" containerID="8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44" Mar 11 12:33:27 crc 
kubenswrapper[4816]: E0311 12:33:27.442731 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44\": container with ID starting with 8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44 not found: ID does not exist" containerID="8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44" Mar 11 12:33:27 crc kubenswrapper[4816]: I0311 12:33:27.442760 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44"} err="failed to get container status \"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44\": rpc error: code = NotFound desc = could not find container \"8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44\": container with ID starting with 8f3894a4899757361634273dd9fe1f17eb732cbd5f15a9b2264dcaf1a22aab44 not found: ID does not exist" Mar 11 12:33:28 crc kubenswrapper[4816]: I0311 12:33:28.141926 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" path="/var/lib/kubelet/pods/fabbe121-d60e-4b29-8f0b-cd0e8fce41c1/volumes" Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.515431 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.516190 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.516323 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.517360 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:33:39 crc kubenswrapper[4816]: I0311 12:33:39.517474 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9" gracePeriod=600 Mar 11 12:33:40 crc kubenswrapper[4816]: I0311 12:33:40.480703 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9" exitCode=0 Mar 11 12:33:40 crc kubenswrapper[4816]: I0311 12:33:40.480806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9"} Mar 11 12:33:40 crc kubenswrapper[4816]: I0311 12:33:40.481092 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad"} Mar 11 12:33:40 crc kubenswrapper[4816]: I0311 12:33:40.481121 4816 scope.go:117] "RemoveContainer" containerID="0a8b5ab78ef936e6a7f5695a077be1086a9b179bfce7660cdc94066fe0301ea3" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.973039 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:33:55 crc kubenswrapper[4816]: E0311 12:33:55.973930 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="extract-utilities" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.973946 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="extract-utilities" Mar 11 12:33:55 crc kubenswrapper[4816]: E0311 12:33:55.973958 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="registry-server" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.973964 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="registry-server" Mar 11 12:33:55 crc kubenswrapper[4816]: E0311 12:33:55.973982 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="extract-content" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.973990 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="extract-content" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.974147 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabbe121-d60e-4b29-8f0b-cd0e8fce41c1" containerName="registry-server" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.975203 4816 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:55 crc kubenswrapper[4816]: I0311 12:33:55.989833 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.072529 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.072679 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.072723 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.173839 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.173920 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.173956 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.174461 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.174586 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.192799 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") pod \"community-operators-7qdkl\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.332810 4816 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:33:56 crc kubenswrapper[4816]: I0311 12:33:56.807107 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:33:56 crc kubenswrapper[4816]: W0311 12:33:56.819487 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42226e5d_abc0_4f65_a104_31582febe5fb.slice/crio-0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3 WatchSource:0}: Error finding container 0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3: Status 404 returned error can't find the container with id 0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3 Mar 11 12:33:57 crc kubenswrapper[4816]: I0311 12:33:57.630778 4816 generic.go:334] "Generic (PLEG): container finished" podID="42226e5d-abc0-4f65-a104-31582febe5fb" containerID="8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b" exitCode=0 Mar 11 12:33:57 crc kubenswrapper[4816]: I0311 12:33:57.630889 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerDied","Data":"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b"} Mar 11 12:33:57 crc kubenswrapper[4816]: I0311 12:33:57.631234 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerStarted","Data":"0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3"} Mar 11 12:33:58 crc kubenswrapper[4816]: I0311 12:33:58.642711 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" 
event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerStarted","Data":"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155"} Mar 11 12:33:59 crc kubenswrapper[4816]: I0311 12:33:59.653096 4816 generic.go:334] "Generic (PLEG): container finished" podID="42226e5d-abc0-4f65-a104-31582febe5fb" containerID="43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155" exitCode=0 Mar 11 12:33:59 crc kubenswrapper[4816]: I0311 12:33:59.653160 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerDied","Data":"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155"} Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.147597 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.149295 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.152193 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.152410 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.152587 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.154081 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.240511 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") pod \"auto-csr-approver-29553874-gxdm8\" (UID: \"37fd63f9-b7c1-4900-a6c1-269f771958b1\") " pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.342006 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") pod \"auto-csr-approver-29553874-gxdm8\" (UID: \"37fd63f9-b7c1-4900-a6c1-269f771958b1\") " pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.363710 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") pod \"auto-csr-approver-29553874-gxdm8\" (UID: \"37fd63f9-b7c1-4900-a6c1-269f771958b1\") " 
pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.477154 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.666350 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerStarted","Data":"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e"} Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.693328 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7qdkl" podStartSLOduration=3.032208965 podStartE2EDuration="5.693228998s" podCreationTimestamp="2026-03-11 12:33:55 +0000 UTC" firstStartedPulling="2026-03-11 12:33:57.632960182 +0000 UTC m=+2124.224224149" lastFinishedPulling="2026-03-11 12:34:00.293980215 +0000 UTC m=+2126.885244182" observedRunningTime="2026-03-11 12:34:00.68647409 +0000 UTC m=+2127.277738077" watchObservedRunningTime="2026-03-11 12:34:00.693228998 +0000 UTC m=+2127.284492965" Mar 11 12:34:00 crc kubenswrapper[4816]: I0311 12:34:00.891401 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:34:00 crc kubenswrapper[4816]: W0311 12:34:00.893959 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37fd63f9_b7c1_4900_a6c1_269f771958b1.slice/crio-e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0 WatchSource:0}: Error finding container e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0: Status 404 returned error can't find the container with id e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0 Mar 11 12:34:01 crc kubenswrapper[4816]: I0311 12:34:01.681596 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" event={"ID":"37fd63f9-b7c1-4900-a6c1-269f771958b1","Type":"ContainerStarted","Data":"e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0"} Mar 11 12:34:02 crc kubenswrapper[4816]: I0311 12:34:02.690984 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" event={"ID":"37fd63f9-b7c1-4900-a6c1-269f771958b1","Type":"ContainerStarted","Data":"372011e32b19c2edb6c819b3e68c03b814d1f5c3bd84d6d127863a08cf21f878"} Mar 11 12:34:03 crc kubenswrapper[4816]: I0311 12:34:03.702048 4816 generic.go:334] "Generic (PLEG): container finished" podID="37fd63f9-b7c1-4900-a6c1-269f771958b1" containerID="372011e32b19c2edb6c819b3e68c03b814d1f5c3bd84d6d127863a08cf21f878" exitCode=0 Mar 11 12:34:03 crc kubenswrapper[4816]: I0311 12:34:03.702118 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" event={"ID":"37fd63f9-b7c1-4900-a6c1-269f771958b1","Type":"ContainerDied","Data":"372011e32b19c2edb6c819b3e68c03b814d1f5c3bd84d6d127863a08cf21f878"} Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.021440 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.120824 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") pod \"37fd63f9-b7c1-4900-a6c1-269f771958b1\" (UID: \"37fd63f9-b7c1-4900-a6c1-269f771958b1\") " Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.136762 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8" (OuterVolumeSpecName: "kube-api-access-g7gx8") pod "37fd63f9-b7c1-4900-a6c1-269f771958b1" (UID: "37fd63f9-b7c1-4900-a6c1-269f771958b1"). InnerVolumeSpecName "kube-api-access-g7gx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.224756 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7gx8\" (UniqueName: \"kubernetes.io/projected/37fd63f9-b7c1-4900-a6c1-269f771958b1-kube-api-access-g7gx8\") on node \"crc\" DevicePath \"\"" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.720932 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" event={"ID":"37fd63f9-b7c1-4900-a6c1-269f771958b1","Type":"ContainerDied","Data":"e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0"} Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.720982 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2e8965efc63e031283f92ec8887043b338e778752affced077e6e1d457d93b0" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.721026 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553874-gxdm8" Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.787992 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:34:05 crc kubenswrapper[4816]: I0311 12:34:05.794863 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553868-qmhvt"] Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.141272 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf" path="/var/lib/kubelet/pods/7828a3bd-8864-4d7a-a6fa-ff3ac4e607bf/volumes" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.333574 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.333673 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.377218 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.771895 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:06 crc kubenswrapper[4816]: I0311 12:34:06.823408 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:34:08 crc kubenswrapper[4816]: I0311 12:34:08.743437 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7qdkl" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="registry-server" containerID="cri-o://44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" gracePeriod=2 Mar 11 12:34:09 
crc kubenswrapper[4816]: I0311 12:34:09.245378 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.398167 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") pod \"42226e5d-abc0-4f65-a104-31582febe5fb\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.398235 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") pod \"42226e5d-abc0-4f65-a104-31582febe5fb\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.398297 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") pod \"42226e5d-abc0-4f65-a104-31582febe5fb\" (UID: \"42226e5d-abc0-4f65-a104-31582febe5fb\") " Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.399737 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities" (OuterVolumeSpecName: "utilities") pod "42226e5d-abc0-4f65-a104-31582febe5fb" (UID: "42226e5d-abc0-4f65-a104-31582febe5fb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.411600 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q" (OuterVolumeSpecName: "kube-api-access-mmd5q") pod "42226e5d-abc0-4f65-a104-31582febe5fb" (UID: "42226e5d-abc0-4f65-a104-31582febe5fb"). InnerVolumeSpecName "kube-api-access-mmd5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.500751 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmd5q\" (UniqueName: \"kubernetes.io/projected/42226e5d-abc0-4f65-a104-31582febe5fb-kube-api-access-mmd5q\") on node \"crc\" DevicePath \"\"" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.500809 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.512784 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42226e5d-abc0-4f65-a104-31582febe5fb" (UID: "42226e5d-abc0-4f65-a104-31582febe5fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.602083 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42226e5d-abc0-4f65-a104-31582febe5fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757267 4816 generic.go:334] "Generic (PLEG): container finished" podID="42226e5d-abc0-4f65-a104-31582febe5fb" containerID="44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" exitCode=0 Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757352 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qdkl" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757365 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerDied","Data":"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e"} Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757483 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qdkl" event={"ID":"42226e5d-abc0-4f65-a104-31582febe5fb","Type":"ContainerDied","Data":"0a3e1b9bdac2862d2e386afd65a93928dc918f71af0c866ccae7ff804dfca7d3"} Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.757528 4816 scope.go:117] "RemoveContainer" containerID="44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.783842 4816 scope.go:117] "RemoveContainer" containerID="43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.801934 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:34:09 crc kubenswrapper[4816]: 
I0311 12:34:09.811468 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7qdkl"] Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.831705 4816 scope.go:117] "RemoveContainer" containerID="8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.863373 4816 scope.go:117] "RemoveContainer" containerID="44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" Mar 11 12:34:09 crc kubenswrapper[4816]: E0311 12:34:09.864158 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e\": container with ID starting with 44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e not found: ID does not exist" containerID="44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.864301 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e"} err="failed to get container status \"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e\": rpc error: code = NotFound desc = could not find container \"44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e\": container with ID starting with 44551a0ab786e673fc970db35e599ddd8fc182f55a9cb9f0d1b0fb3fd65e486e not found: ID does not exist" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.864372 4816 scope.go:117] "RemoveContainer" containerID="43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155" Mar 11 12:34:09 crc kubenswrapper[4816]: E0311 12:34:09.865229 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155\": container 
with ID starting with 43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155 not found: ID does not exist" containerID="43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.865326 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155"} err="failed to get container status \"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155\": rpc error: code = NotFound desc = could not find container \"43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155\": container with ID starting with 43a1fea604ae9d7a01a28b9305a4970702dcbf2f2d004b5e131f9b54bd525155 not found: ID does not exist" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.865374 4816 scope.go:117] "RemoveContainer" containerID="8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b" Mar 11 12:34:09 crc kubenswrapper[4816]: E0311 12:34:09.865793 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b\": container with ID starting with 8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b not found: ID does not exist" containerID="8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b" Mar 11 12:34:09 crc kubenswrapper[4816]: I0311 12:34:09.865865 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b"} err="failed to get container status \"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b\": rpc error: code = NotFound desc = could not find container \"8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b\": container with ID starting with 8983ba5b36e1f28d425ff80ec8440f84b6129945aebd6c4f111d84202cb1c00b not 
found: ID does not exist" Mar 11 12:34:10 crc kubenswrapper[4816]: I0311 12:34:10.140189 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" path="/var/lib/kubelet/pods/42226e5d-abc0-4f65-a104-31582febe5fb/volumes" Mar 11 12:34:47 crc kubenswrapper[4816]: I0311 12:34:47.562819 4816 scope.go:117] "RemoveContainer" containerID="6f896b214f33da369f143727ecfdb3b64f134749ec6e50337a2f62ef03d15c62" Mar 11 12:35:39 crc kubenswrapper[4816]: I0311 12:35:39.518764 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:35:39 crc kubenswrapper[4816]: I0311 12:35:39.519830 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.535433 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:35:57 crc kubenswrapper[4816]: E0311 12:35:57.536465 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37fd63f9-b7c1-4900-a6c1-269f771958b1" containerName="oc" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536486 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="37fd63f9-b7c1-4900-a6c1-269f771958b1" containerName="oc" Mar 11 12:35:57 crc kubenswrapper[4816]: E0311 12:35:57.536518 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="extract-content" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 
12:35:57.536529 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="extract-content" Mar 11 12:35:57 crc kubenswrapper[4816]: E0311 12:35:57.536544 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="registry-server" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536553 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="registry-server" Mar 11 12:35:57 crc kubenswrapper[4816]: E0311 12:35:57.536564 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="extract-utilities" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536573 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="extract-utilities" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536735 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="42226e5d-abc0-4f65-a104-31582febe5fb" containerName="registry-server" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.536758 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="37fd63f9-b7c1-4900-a6c1-269f771958b1" containerName="oc" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.538336 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.553013 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.676628 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.676679 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.676797 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778042 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778186 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778215 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778702 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.778753 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.802664 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") pod \"redhat-marketplace-hrg8t\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:57 crc kubenswrapper[4816]: I0311 12:35:57.875453 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:35:58 crc kubenswrapper[4816]: I0311 12:35:58.307193 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:35:58 crc kubenswrapper[4816]: I0311 12:35:58.673098 4816 generic.go:334] "Generic (PLEG): container finished" podID="16cec869-9798-4a51-b950-59a57dfa3c37" containerID="f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0" exitCode=0 Mar 11 12:35:58 crc kubenswrapper[4816]: I0311 12:35:58.673562 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerDied","Data":"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0"} Mar 11 12:35:58 crc kubenswrapper[4816]: I0311 12:35:58.673618 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerStarted","Data":"a088adcbed3f980d1c239b585dfbe0befdd15b0a6eaae124e57dfe197c46e993"} Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.150818 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.152822 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.157191 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.176526 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.176883 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.177000 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.315758 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") pod \"auto-csr-approver-29553876-55nmf\" (UID: \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\") " pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.418951 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") pod \"auto-csr-approver-29553876-55nmf\" (UID: \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\") " pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.445169 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") pod \"auto-csr-approver-29553876-55nmf\" (UID: \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\") " 
pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.536999 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.698840 4816 generic.go:334] "Generic (PLEG): container finished" podID="16cec869-9798-4a51-b950-59a57dfa3c37" containerID="0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c" exitCode=0 Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.699596 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerDied","Data":"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c"} Mar 11 12:36:00 crc kubenswrapper[4816]: I0311 12:36:00.792033 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:36:01 crc kubenswrapper[4816]: I0311 12:36:01.709915 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerStarted","Data":"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa"} Mar 11 12:36:01 crc kubenswrapper[4816]: I0311 12:36:01.715438 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553876-55nmf" event={"ID":"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f","Type":"ContainerStarted","Data":"4e35de0190418478e45846aa2c23f6f00d64b785f24078cec25aba7b5c721f35"} Mar 11 12:36:01 crc kubenswrapper[4816]: I0311 12:36:01.743144 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hrg8t" podStartSLOduration=2.214650745 podStartE2EDuration="4.743116696s" podCreationTimestamp="2026-03-11 12:35:57 +0000 UTC" firstStartedPulling="2026-03-11 
12:35:58.675167204 +0000 UTC m=+2245.266431171" lastFinishedPulling="2026-03-11 12:36:01.203633155 +0000 UTC m=+2247.794897122" observedRunningTime="2026-03-11 12:36:01.734008501 +0000 UTC m=+2248.325272488" watchObservedRunningTime="2026-03-11 12:36:01.743116696 +0000 UTC m=+2248.334380683" Mar 11 12:36:02 crc kubenswrapper[4816]: I0311 12:36:02.725158 4816 generic.go:334] "Generic (PLEG): container finished" podID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" containerID="fcc96a304b12ffc267c89c3d4b3f056b4d2e01821a0ebbb16c1bdcf350072143" exitCode=0 Mar 11 12:36:02 crc kubenswrapper[4816]: I0311 12:36:02.725670 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553876-55nmf" event={"ID":"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f","Type":"ContainerDied","Data":"fcc96a304b12ffc267c89c3d4b3f056b4d2e01821a0ebbb16c1bdcf350072143"} Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.010140 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.182329 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") pod \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\" (UID: \"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f\") " Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.212845 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc" (OuterVolumeSpecName: "kube-api-access-l44dc") pod "feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" (UID: "feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f"). InnerVolumeSpecName "kube-api-access-l44dc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.284713 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l44dc\" (UniqueName: \"kubernetes.io/projected/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f-kube-api-access-l44dc\") on node \"crc\" DevicePath \"\"" Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.749927 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553876-55nmf" event={"ID":"feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f","Type":"ContainerDied","Data":"4e35de0190418478e45846aa2c23f6f00d64b785f24078cec25aba7b5c721f35"} Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.749991 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e35de0190418478e45846aa2c23f6f00d64b785f24078cec25aba7b5c721f35" Mar 11 12:36:04 crc kubenswrapper[4816]: I0311 12:36:04.750137 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553876-55nmf" Mar 11 12:36:05 crc kubenswrapper[4816]: I0311 12:36:05.098819 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:36:05 crc kubenswrapper[4816]: I0311 12:36:05.107602 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553870-5ndrc"] Mar 11 12:36:06 crc kubenswrapper[4816]: I0311 12:36:06.143216 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2920f68a-c5bb-474c-929b-09ced109bcc0" path="/var/lib/kubelet/pods/2920f68a-c5bb-474c-929b-09ced109bcc0/volumes" Mar 11 12:36:07 crc kubenswrapper[4816]: I0311 12:36:07.875607 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:07 crc kubenswrapper[4816]: I0311 12:36:07.875826 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:07 crc kubenswrapper[4816]: I0311 12:36:07.956061 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:08 crc kubenswrapper[4816]: I0311 12:36:08.846730 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:08 crc kubenswrapper[4816]: I0311 12:36:08.905735 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:36:09 crc kubenswrapper[4816]: I0311 12:36:09.515852 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:36:09 crc kubenswrapper[4816]: I0311 12:36:09.516417 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:36:10 crc kubenswrapper[4816]: I0311 12:36:10.811371 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hrg8t" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="registry-server" containerID="cri-o://16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" gracePeriod=2 Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.303898 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.419651 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") pod \"16cec869-9798-4a51-b950-59a57dfa3c37\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.419921 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") pod \"16cec869-9798-4a51-b950-59a57dfa3c37\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.419975 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") pod \"16cec869-9798-4a51-b950-59a57dfa3c37\" (UID: \"16cec869-9798-4a51-b950-59a57dfa3c37\") " Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.420612 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities" (OuterVolumeSpecName: "utilities") pod "16cec869-9798-4a51-b950-59a57dfa3c37" (UID: "16cec869-9798-4a51-b950-59a57dfa3c37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.427670 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds" (OuterVolumeSpecName: "kube-api-access-kkhds") pod "16cec869-9798-4a51-b950-59a57dfa3c37" (UID: "16cec869-9798-4a51-b950-59a57dfa3c37"). InnerVolumeSpecName "kube-api-access-kkhds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.461374 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16cec869-9798-4a51-b950-59a57dfa3c37" (UID: "16cec869-9798-4a51-b950-59a57dfa3c37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.521862 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkhds\" (UniqueName: \"kubernetes.io/projected/16cec869-9798-4a51-b950-59a57dfa3c37-kube-api-access-kkhds\") on node \"crc\" DevicePath \"\"" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.521906 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.521919 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16cec869-9798-4a51-b950-59a57dfa3c37-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822504 4816 generic.go:334] "Generic (PLEG): container finished" podID="16cec869-9798-4a51-b950-59a57dfa3c37" containerID="16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" exitCode=0 Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822578 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerDied","Data":"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa"} Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822606 4816 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrg8t" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822639 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrg8t" event={"ID":"16cec869-9798-4a51-b950-59a57dfa3c37","Type":"ContainerDied","Data":"a088adcbed3f980d1c239b585dfbe0befdd15b0a6eaae124e57dfe197c46e993"} Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.822677 4816 scope.go:117] "RemoveContainer" containerID="16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.849676 4816 scope.go:117] "RemoveContainer" containerID="0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.883798 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.884486 4816 scope.go:117] "RemoveContainer" containerID="f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.896175 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrg8t"] Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.926944 4816 scope.go:117] "RemoveContainer" containerID="16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" Mar 11 12:36:11 crc kubenswrapper[4816]: E0311 12:36:11.927628 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa\": container with ID starting with 16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa not found: ID does not exist" containerID="16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.927671 4816 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa"} err="failed to get container status \"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa\": rpc error: code = NotFound desc = could not find container \"16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa\": container with ID starting with 16d24ba0c5d748973ba12af27011a40d6d8c1a614a1d7b1d91fd2e2b2b62d5aa not found: ID does not exist" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.927710 4816 scope.go:117] "RemoveContainer" containerID="0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c" Mar 11 12:36:11 crc kubenswrapper[4816]: E0311 12:36:11.928277 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c\": container with ID starting with 0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c not found: ID does not exist" containerID="0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.928352 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c"} err="failed to get container status \"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c\": rpc error: code = NotFound desc = could not find container \"0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c\": container with ID starting with 0d48faa8a89c7695f1cc4aa05889c97c7e77ae5de899328361c4bed02fe9144c not found: ID does not exist" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.928399 4816 scope.go:117] "RemoveContainer" containerID="f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0" Mar 11 12:36:11 crc kubenswrapper[4816]: E0311 
12:36:11.929052 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0\": container with ID starting with f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0 not found: ID does not exist" containerID="f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0" Mar 11 12:36:11 crc kubenswrapper[4816]: I0311 12:36:11.929087 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0"} err="failed to get container status \"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0\": rpc error: code = NotFound desc = could not find container \"f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0\": container with ID starting with f832c5c127e51cab2092c0ee970fb07c751c81b54b216bfcc2b98bc4378703d0 not found: ID does not exist" Mar 11 12:36:12 crc kubenswrapper[4816]: I0311 12:36:12.145309 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" path="/var/lib/kubelet/pods/16cec869-9798-4a51-b950-59a57dfa3c37/volumes" Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.515902 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.516832 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.516911 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.517720 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:36:39 crc kubenswrapper[4816]: I0311 12:36:39.517821 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" gracePeriod=600 Mar 11 12:36:39 crc kubenswrapper[4816]: E0311 12:36:39.641804 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:36:40 crc kubenswrapper[4816]: I0311 12:36:40.073439 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" exitCode=0 Mar 11 12:36:40 crc kubenswrapper[4816]: I0311 12:36:40.073518 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad"} Mar 11 12:36:40 crc kubenswrapper[4816]: I0311 12:36:40.073590 4816 scope.go:117] "RemoveContainer" containerID="106e2d5f907b914dfe49698bbb91ece73a062b224d2ba46fe31a9e998555b6c9" Mar 11 12:36:40 crc kubenswrapper[4816]: I0311 12:36:40.074775 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:36:40 crc kubenswrapper[4816]: E0311 12:36:40.075158 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:36:47 crc kubenswrapper[4816]: I0311 12:36:47.685294 4816 scope.go:117] "RemoveContainer" containerID="921a40cedc2e53aafc505227f80b7caf98ac145dd8c9d234ce2c285d7eb65e19" Mar 11 12:36:54 crc kubenswrapper[4816]: I0311 12:36:54.137122 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:36:54 crc kubenswrapper[4816]: E0311 12:36:54.138189 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:07 crc kubenswrapper[4816]: I0311 12:37:07.130841 4816 scope.go:117] "RemoveContainer" 
containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:07 crc kubenswrapper[4816]: E0311 12:37:07.131996 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:21 crc kubenswrapper[4816]: I0311 12:37:21.130455 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:21 crc kubenswrapper[4816]: E0311 12:37:21.131557 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:32 crc kubenswrapper[4816]: I0311 12:37:32.130205 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:32 crc kubenswrapper[4816]: E0311 12:37:32.131172 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:45 crc kubenswrapper[4816]: I0311 12:37:45.130572 4816 scope.go:117] 
"RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:45 crc kubenswrapper[4816]: E0311 12:37:45.131618 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:37:58 crc kubenswrapper[4816]: I0311 12:37:58.130960 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:37:58 crc kubenswrapper[4816]: E0311 12:37:58.131820 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.157481 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:38:00 crc kubenswrapper[4816]: E0311 12:38:00.158291 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="registry-server" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158311 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="registry-server" Mar 11 12:38:00 crc kubenswrapper[4816]: E0311 12:38:00.158336 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" containerName="oc" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158345 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" containerName="oc" Mar 11 12:38:00 crc kubenswrapper[4816]: E0311 12:38:00.158373 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="extract-utilities" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158383 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="extract-utilities" Mar 11 12:38:00 crc kubenswrapper[4816]: E0311 12:38:00.158405 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="extract-content" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158415 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="extract-content" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158615 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="16cec869-9798-4a51-b950-59a57dfa3c37" containerName="registry-server" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.158631 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" containerName="oc" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.159229 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.164304 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.165031 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.166897 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.167674 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.341659 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") pod \"auto-csr-approver-29553878-cp29z\" (UID: \"080a6096-67c3-41de-97d6-29a3f80c027e\") " pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.443117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") pod \"auto-csr-approver-29553878-cp29z\" (UID: \"080a6096-67c3-41de-97d6-29a3f80c027e\") " pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.465144 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") pod \"auto-csr-approver-29553878-cp29z\" (UID: \"080a6096-67c3-41de-97d6-29a3f80c027e\") " 
pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.487403 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:00 crc kubenswrapper[4816]: I0311 12:38:00.919524 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:38:01 crc kubenswrapper[4816]: I0311 12:38:01.761833 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553878-cp29z" event={"ID":"080a6096-67c3-41de-97d6-29a3f80c027e","Type":"ContainerStarted","Data":"9b194b24caa1f4ba114ffb2ee3ef4ae162c9b5c6400e727b6ed6cfc030d0acea"} Mar 11 12:38:02 crc kubenswrapper[4816]: I0311 12:38:02.771140 4816 generic.go:334] "Generic (PLEG): container finished" podID="080a6096-67c3-41de-97d6-29a3f80c027e" containerID="1e58ff6aa0fcfd993b576ab14b5750187bf29f39db71f93177979cba96a1d350" exitCode=0 Mar 11 12:38:02 crc kubenswrapper[4816]: I0311 12:38:02.771353 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553878-cp29z" event={"ID":"080a6096-67c3-41de-97d6-29a3f80c027e","Type":"ContainerDied","Data":"1e58ff6aa0fcfd993b576ab14b5750187bf29f39db71f93177979cba96a1d350"} Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.087878 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.204100 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") pod \"080a6096-67c3-41de-97d6-29a3f80c027e\" (UID: \"080a6096-67c3-41de-97d6-29a3f80c027e\") " Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.211375 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg" (OuterVolumeSpecName: "kube-api-access-h48lg") pod "080a6096-67c3-41de-97d6-29a3f80c027e" (UID: "080a6096-67c3-41de-97d6-29a3f80c027e"). InnerVolumeSpecName "kube-api-access-h48lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.306244 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/080a6096-67c3-41de-97d6-29a3f80c027e-kube-api-access-h48lg\") on node \"crc\" DevicePath \"\"" Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.792716 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553878-cp29z" event={"ID":"080a6096-67c3-41de-97d6-29a3f80c027e","Type":"ContainerDied","Data":"9b194b24caa1f4ba114ffb2ee3ef4ae162c9b5c6400e727b6ed6cfc030d0acea"} Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.792763 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b194b24caa1f4ba114ffb2ee3ef4ae162c9b5c6400e727b6ed6cfc030d0acea" Mar 11 12:38:04 crc kubenswrapper[4816]: I0311 12:38:04.793522 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553878-cp29z" Mar 11 12:38:05 crc kubenswrapper[4816]: I0311 12:38:05.190477 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:38:05 crc kubenswrapper[4816]: I0311 12:38:05.196142 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553872-r7v8g"] Mar 11 12:38:06 crc kubenswrapper[4816]: I0311 12:38:06.140548 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea5145c-0d08-4c85-984a-84c7e0820999" path="/var/lib/kubelet/pods/9ea5145c-0d08-4c85-984a-84c7e0820999/volumes" Mar 11 12:38:13 crc kubenswrapper[4816]: I0311 12:38:13.130699 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:38:13 crc kubenswrapper[4816]: E0311 12:38:13.131443 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:38:25 crc kubenswrapper[4816]: I0311 12:38:25.131122 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:38:25 crc kubenswrapper[4816]: E0311 12:38:25.132567 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:38:40 crc kubenswrapper[4816]: I0311 12:38:40.131638 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:38:40 crc kubenswrapper[4816]: E0311 12:38:40.133118 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:38:47 crc kubenswrapper[4816]: I0311 12:38:47.795768 4816 scope.go:117] "RemoveContainer" containerID="2595726d36e0a1d13282e231fa75cc95c6cd459575385b61b5632d40de6eac9f" Mar 11 12:38:53 crc kubenswrapper[4816]: I0311 12:38:53.131684 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:38:53 crc kubenswrapper[4816]: E0311 12:38:53.132570 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:39:07 crc kubenswrapper[4816]: I0311 12:39:07.131044 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:39:07 crc kubenswrapper[4816]: E0311 12:39:07.133490 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:39:21 crc kubenswrapper[4816]: I0311 12:39:21.132065 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:39:21 crc kubenswrapper[4816]: E0311 12:39:21.133445 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:39:36 crc kubenswrapper[4816]: I0311 12:39:36.131542 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:39:36 crc kubenswrapper[4816]: E0311 12:39:36.133237 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:39:51 crc kubenswrapper[4816]: I0311 12:39:51.131406 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:39:51 crc kubenswrapper[4816]: E0311 12:39:51.132743 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.144469 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:40:00 crc kubenswrapper[4816]: E0311 12:40:00.145502 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080a6096-67c3-41de-97d6-29a3f80c027e" containerName="oc" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.145523 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="080a6096-67c3-41de-97d6-29a3f80c027e" containerName="oc" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.145672 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="080a6096-67c3-41de-97d6-29a3f80c027e" containerName="oc" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.146291 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.151477 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.151762 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.151967 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.162647 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.223714 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") pod \"auto-csr-approver-29553880-9ks6n\" (UID: \"e47f911d-5bf4-4923-a6ac-95e98217fd25\") " pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.324862 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") pod \"auto-csr-approver-29553880-9ks6n\" (UID: \"e47f911d-5bf4-4923-a6ac-95e98217fd25\") " pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.344347 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") pod \"auto-csr-approver-29553880-9ks6n\" (UID: \"e47f911d-5bf4-4923-a6ac-95e98217fd25\") " 
pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.469047 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.918518 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.925865 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:40:00 crc kubenswrapper[4816]: I0311 12:40:00.983961 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" event={"ID":"e47f911d-5bf4-4923-a6ac-95e98217fd25","Type":"ContainerStarted","Data":"2be0847875d3b20f4ebfa981d256af6675af1bf3c93c41747bb16a77a188363f"} Mar 11 12:40:03 crc kubenswrapper[4816]: I0311 12:40:03.000894 4816 generic.go:334] "Generic (PLEG): container finished" podID="e47f911d-5bf4-4923-a6ac-95e98217fd25" containerID="f2c8244acc6c0aed95c31a23f8089006b34d5b7db0dcdad32b1e6365dd4fd124" exitCode=0 Mar 11 12:40:03 crc kubenswrapper[4816]: I0311 12:40:03.000964 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" event={"ID":"e47f911d-5bf4-4923-a6ac-95e98217fd25","Type":"ContainerDied","Data":"f2c8244acc6c0aed95c31a23f8089006b34d5b7db0dcdad32b1e6365dd4fd124"} Mar 11 12:40:04 crc kubenswrapper[4816]: I0311 12:40:04.306115 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:04 crc kubenswrapper[4816]: I0311 12:40:04.393218 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") pod \"e47f911d-5bf4-4923-a6ac-95e98217fd25\" (UID: \"e47f911d-5bf4-4923-a6ac-95e98217fd25\") " Mar 11 12:40:04 crc kubenswrapper[4816]: I0311 12:40:04.402585 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd" (OuterVolumeSpecName: "kube-api-access-9cgdd") pod "e47f911d-5bf4-4923-a6ac-95e98217fd25" (UID: "e47f911d-5bf4-4923-a6ac-95e98217fd25"). InnerVolumeSpecName "kube-api-access-9cgdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:40:04 crc kubenswrapper[4816]: I0311 12:40:04.495200 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cgdd\" (UniqueName: \"kubernetes.io/projected/e47f911d-5bf4-4923-a6ac-95e98217fd25-kube-api-access-9cgdd\") on node \"crc\" DevicePath \"\"" Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.017266 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" event={"ID":"e47f911d-5bf4-4923-a6ac-95e98217fd25","Type":"ContainerDied","Data":"2be0847875d3b20f4ebfa981d256af6675af1bf3c93c41747bb16a77a188363f"} Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.017317 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2be0847875d3b20f4ebfa981d256af6675af1bf3c93c41747bb16a77a188363f" Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.017359 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553880-9ks6n" Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.131074 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:05 crc kubenswrapper[4816]: E0311 12:40:05.131384 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.381565 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:40:05 crc kubenswrapper[4816]: I0311 12:40:05.387620 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553874-gxdm8"] Mar 11 12:40:06 crc kubenswrapper[4816]: I0311 12:40:06.140334 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37fd63f9-b7c1-4900-a6c1-269f771958b1" path="/var/lib/kubelet/pods/37fd63f9-b7c1-4900-a6c1-269f771958b1/volumes" Mar 11 12:40:18 crc kubenswrapper[4816]: I0311 12:40:18.130372 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:18 crc kubenswrapper[4816]: E0311 12:40:18.131283 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:29 crc kubenswrapper[4816]: I0311 12:40:29.131156 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:29 crc kubenswrapper[4816]: E0311 12:40:29.132292 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:42 crc kubenswrapper[4816]: I0311 12:40:42.131800 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:42 crc kubenswrapper[4816]: E0311 12:40:42.133195 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:40:47 crc kubenswrapper[4816]: I0311 12:40:47.914731 4816 scope.go:117] "RemoveContainer" containerID="372011e32b19c2edb6c819b3e68c03b814d1f5c3bd84d6d127863a08cf21f878" Mar 11 12:40:57 crc kubenswrapper[4816]: I0311 12:40:57.131999 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:40:57 crc kubenswrapper[4816]: E0311 12:40:57.133299 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:41:08 crc kubenswrapper[4816]: I0311 12:41:08.130980 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:41:08 crc kubenswrapper[4816]: E0311 12:41:08.132380 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:41:23 crc kubenswrapper[4816]: I0311 12:41:23.130282 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:41:23 crc kubenswrapper[4816]: E0311 12:41:23.131460 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:41:34 crc kubenswrapper[4816]: I0311 12:41:34.137076 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:41:34 crc kubenswrapper[4816]: E0311 12:41:34.138147 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:41:48 crc kubenswrapper[4816]: I0311 12:41:48.130551 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:41:48 crc kubenswrapper[4816]: I0311 12:41:48.964072 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211"} Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.171021 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:42:00 crc kubenswrapper[4816]: E0311 12:42:00.173502 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47f911d-5bf4-4923-a6ac-95e98217fd25" containerName="oc" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.173614 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47f911d-5bf4-4923-a6ac-95e98217fd25" containerName="oc" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.173899 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="e47f911d-5bf4-4923-a6ac-95e98217fd25" containerName="oc" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.174520 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.178303 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.178402 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.178846 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.189920 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.335954 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") pod \"auto-csr-approver-29553882-ggqkk\" (UID: \"5a88e2af-5e7d-4491-a32f-75a670aed689\") " pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.437873 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") pod \"auto-csr-approver-29553882-ggqkk\" (UID: \"5a88e2af-5e7d-4491-a32f-75a670aed689\") " pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.463440 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") pod \"auto-csr-approver-29553882-ggqkk\" (UID: \"5a88e2af-5e7d-4491-a32f-75a670aed689\") " 
pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.498507 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:00 crc kubenswrapper[4816]: I0311 12:42:00.971177 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:42:01 crc kubenswrapper[4816]: I0311 12:42:01.076143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" event={"ID":"5a88e2af-5e7d-4491-a32f-75a670aed689","Type":"ContainerStarted","Data":"650e6171e13f02dde6c15ca8c999cea145dc5871686afe0ccb3bed8c4e70a0a3"} Mar 11 12:42:03 crc kubenswrapper[4816]: I0311 12:42:03.093189 4816 generic.go:334] "Generic (PLEG): container finished" podID="5a88e2af-5e7d-4491-a32f-75a670aed689" containerID="c588ba0a9276d85151be0b86106d7b0f7a77bf5bc78e6ea0213f1a19b8ad671f" exitCode=0 Mar 11 12:42:03 crc kubenswrapper[4816]: I0311 12:42:03.093292 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" event={"ID":"5a88e2af-5e7d-4491-a32f-75a670aed689","Type":"ContainerDied","Data":"c588ba0a9276d85151be0b86106d7b0f7a77bf5bc78e6ea0213f1a19b8ad671f"} Mar 11 12:42:04 crc kubenswrapper[4816]: I0311 12:42:04.425310 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:04 crc kubenswrapper[4816]: I0311 12:42:04.543415 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") pod \"5a88e2af-5e7d-4491-a32f-75a670aed689\" (UID: \"5a88e2af-5e7d-4491-a32f-75a670aed689\") " Mar 11 12:42:04 crc kubenswrapper[4816]: I0311 12:42:04.549639 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np" (OuterVolumeSpecName: "kube-api-access-fp7np") pod "5a88e2af-5e7d-4491-a32f-75a670aed689" (UID: "5a88e2af-5e7d-4491-a32f-75a670aed689"). InnerVolumeSpecName "kube-api-access-fp7np". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:42:04 crc kubenswrapper[4816]: I0311 12:42:04.645112 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp7np\" (UniqueName: \"kubernetes.io/projected/5a88e2af-5e7d-4491-a32f-75a670aed689-kube-api-access-fp7np\") on node \"crc\" DevicePath \"\"" Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.111469 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" event={"ID":"5a88e2af-5e7d-4491-a32f-75a670aed689","Type":"ContainerDied","Data":"650e6171e13f02dde6c15ca8c999cea145dc5871686afe0ccb3bed8c4e70a0a3"} Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.111524 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="650e6171e13f02dde6c15ca8c999cea145dc5871686afe0ccb3bed8c4e70a0a3" Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.111558 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553882-ggqkk" Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.507344 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:42:05 crc kubenswrapper[4816]: I0311 12:42:05.515062 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553876-55nmf"] Mar 11 12:42:06 crc kubenswrapper[4816]: I0311 12:42:06.144870 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f" path="/var/lib/kubelet/pods/feb8451d-15b3-4ed2-815e-8aa5c1d8bc7f/volumes" Mar 11 12:42:48 crc kubenswrapper[4816]: I0311 12:42:48.057109 4816 scope.go:117] "RemoveContainer" containerID="fcc96a304b12ffc267c89c3d4b3f056b4d2e01821a0ebbb16c1bdcf350072143" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.511701 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:04 crc kubenswrapper[4816]: E0311 12:43:04.519668 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a88e2af-5e7d-4491-a32f-75a670aed689" containerName="oc" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.519860 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a88e2af-5e7d-4491-a32f-75a670aed689" containerName="oc" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.520841 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a88e2af-5e7d-4491-a32f-75a670aed689" containerName="oc" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.523665 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.530834 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.620443 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.620485 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.620543 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.722195 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.722300 4816 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.722402 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.723035 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.723092 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.748631 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") pod \"redhat-operators-9wx28\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:04 crc kubenswrapper[4816]: I0311 12:43:04.856782 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:05 crc kubenswrapper[4816]: I0311 12:43:05.365790 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:05 crc kubenswrapper[4816]: I0311 12:43:05.574972 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerStarted","Data":"fc8d0a8c3b36be2e38db9a330d65c8910022292f8f5383e4d2bf40ab5bbc1f7a"} Mar 11 12:43:06 crc kubenswrapper[4816]: I0311 12:43:06.584741 4816 generic.go:334] "Generic (PLEG): container finished" podID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerID="04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b" exitCode=0 Mar 11 12:43:06 crc kubenswrapper[4816]: I0311 12:43:06.585680 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerDied","Data":"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b"} Mar 11 12:43:07 crc kubenswrapper[4816]: I0311 12:43:07.593800 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerStarted","Data":"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23"} Mar 11 12:43:08 crc kubenswrapper[4816]: I0311 12:43:08.606397 4816 generic.go:334] "Generic (PLEG): container finished" podID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerID="268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23" exitCode=0 Mar 11 12:43:08 crc kubenswrapper[4816]: I0311 12:43:08.606471 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" 
event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerDied","Data":"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23"} Mar 11 12:43:09 crc kubenswrapper[4816]: I0311 12:43:09.616221 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerStarted","Data":"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc"} Mar 11 12:43:09 crc kubenswrapper[4816]: I0311 12:43:09.638930 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9wx28" podStartSLOduration=3.171722704 podStartE2EDuration="5.638905089s" podCreationTimestamp="2026-03-11 12:43:04 +0000 UTC" firstStartedPulling="2026-03-11 12:43:06.587015289 +0000 UTC m=+2673.178279296" lastFinishedPulling="2026-03-11 12:43:09.054197714 +0000 UTC m=+2675.645461681" observedRunningTime="2026-03-11 12:43:09.633606818 +0000 UTC m=+2676.224870785" watchObservedRunningTime="2026-03-11 12:43:09.638905089 +0000 UTC m=+2676.230169056" Mar 11 12:43:14 crc kubenswrapper[4816]: I0311 12:43:14.857752 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:14 crc kubenswrapper[4816]: I0311 12:43:14.858691 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:15 crc kubenswrapper[4816]: I0311 12:43:15.907235 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9wx28" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" probeResult="failure" output=< Mar 11 12:43:15 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 12:43:15 crc kubenswrapper[4816]: > Mar 11 12:43:24 crc kubenswrapper[4816]: I0311 12:43:24.911234 4816 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:24 crc kubenswrapper[4816]: I0311 12:43:24.961428 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:25 crc kubenswrapper[4816]: I0311 12:43:25.162761 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:26 crc kubenswrapper[4816]: I0311 12:43:26.768870 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9wx28" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" containerID="cri-o://dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" gracePeriod=2 Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.180778 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.211862 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") pod \"63f54d4d-3bee-42aa-82f6-3149d37d9358\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.211951 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") pod \"63f54d4d-3bee-42aa-82f6-3149d37d9358\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.212020 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") pod 
\"63f54d4d-3bee-42aa-82f6-3149d37d9358\" (UID: \"63f54d4d-3bee-42aa-82f6-3149d37d9358\") " Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.213308 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities" (OuterVolumeSpecName: "utilities") pod "63f54d4d-3bee-42aa-82f6-3149d37d9358" (UID: "63f54d4d-3bee-42aa-82f6-3149d37d9358"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.213832 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.223183 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx" (OuterVolumeSpecName: "kube-api-access-k8nrx") pod "63f54d4d-3bee-42aa-82f6-3149d37d9358" (UID: "63f54d4d-3bee-42aa-82f6-3149d37d9358"). InnerVolumeSpecName "kube-api-access-k8nrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.314865 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8nrx\" (UniqueName: \"kubernetes.io/projected/63f54d4d-3bee-42aa-82f6-3149d37d9358-kube-api-access-k8nrx\") on node \"crc\" DevicePath \"\"" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.387072 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63f54d4d-3bee-42aa-82f6-3149d37d9358" (UID: "63f54d4d-3bee-42aa-82f6-3149d37d9358"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.417080 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f54d4d-3bee-42aa-82f6-3149d37d9358-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.780876 4816 generic.go:334] "Generic (PLEG): container finished" podID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerID="dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" exitCode=0 Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.780932 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerDied","Data":"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc"} Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.780977 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wx28" event={"ID":"63f54d4d-3bee-42aa-82f6-3149d37d9358","Type":"ContainerDied","Data":"fc8d0a8c3b36be2e38db9a330d65c8910022292f8f5383e4d2bf40ab5bbc1f7a"} Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.781001 4816 scope.go:117] "RemoveContainer" containerID="dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.780992 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wx28" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.804772 4816 scope.go:117] "RemoveContainer" containerID="268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.826709 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.838848 4816 scope.go:117] "RemoveContainer" containerID="04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.840343 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9wx28"] Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.866865 4816 scope.go:117] "RemoveContainer" containerID="dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" Mar 11 12:43:27 crc kubenswrapper[4816]: E0311 12:43:27.867643 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc\": container with ID starting with dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc not found: ID does not exist" containerID="dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.867737 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc"} err="failed to get container status \"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc\": rpc error: code = NotFound desc = could not find container \"dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc\": container with ID starting with dbf0373774db82b350f7c00f6be9ded2d015a35927d6c7b9bc0efdbc4f15dbdc not found: ID does 
not exist" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.867826 4816 scope.go:117] "RemoveContainer" containerID="268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23" Mar 11 12:43:27 crc kubenswrapper[4816]: E0311 12:43:27.868396 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23\": container with ID starting with 268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23 not found: ID does not exist" containerID="268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.868443 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23"} err="failed to get container status \"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23\": rpc error: code = NotFound desc = could not find container \"268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23\": container with ID starting with 268c82d42295b8241fbb9270a5b043fad5a8eb2cc3c87fd2faf6be0289be1a23 not found: ID does not exist" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.868476 4816 scope.go:117] "RemoveContainer" containerID="04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b" Mar 11 12:43:27 crc kubenswrapper[4816]: E0311 12:43:27.868858 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b\": container with ID starting with 04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b not found: ID does not exist" containerID="04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b" Mar 11 12:43:27 crc kubenswrapper[4816]: I0311 12:43:27.868905 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b"} err="failed to get container status \"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b\": rpc error: code = NotFound desc = could not find container \"04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b\": container with ID starting with 04b560f6ae79cb9e858339c4b90ffb7f6b2c5580871e4c7defac7d254bcfcf7b not found: ID does not exist" Mar 11 12:43:28 crc kubenswrapper[4816]: I0311 12:43:28.149868 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" path="/var/lib/kubelet/pods/63f54d4d-3bee-42aa-82f6-3149d37d9358/volumes" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.147905 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:44:00 crc kubenswrapper[4816]: E0311 12:44:00.148895 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.148914 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" Mar 11 12:44:00 crc kubenswrapper[4816]: E0311 12:44:00.148936 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="extract-content" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.148946 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="extract-content" Mar 11 12:44:00 crc kubenswrapper[4816]: E0311 12:44:00.148954 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="extract-utilities" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.148962 4816 
state_mem.go:107] "Deleted CPUSet assignment" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="extract-utilities" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.149165 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f54d4d-3bee-42aa-82f6-3149d37d9358" containerName="registry-server" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.149778 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.151710 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.152062 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.152314 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.152799 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.325589 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") pod \"auto-csr-approver-29553884-t7cv9\" (UID: \"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\") " pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.427139 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") pod \"auto-csr-approver-29553884-t7cv9\" (UID: 
\"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\") " pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.446860 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") pod \"auto-csr-approver-29553884-t7cv9\" (UID: \"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\") " pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.475692 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:00 crc kubenswrapper[4816]: I0311 12:44:00.884026 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:44:01 crc kubenswrapper[4816]: I0311 12:44:01.048591 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" event={"ID":"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7","Type":"ContainerStarted","Data":"e98daec571cc58b471b8d4238f5502acfec915e41235f877dcf0710ca15f8da9"} Mar 11 12:44:03 crc kubenswrapper[4816]: I0311 12:44:03.070884 4816 generic.go:334] "Generic (PLEG): container finished" podID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" containerID="da680962b6fbbd0e75bc32153dd7114d5c7dd1b60db6d2fbbedf1eb60245a10a" exitCode=0 Mar 11 12:44:03 crc kubenswrapper[4816]: I0311 12:44:03.071013 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" event={"ID":"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7","Type":"ContainerDied","Data":"da680962b6fbbd0e75bc32153dd7114d5c7dd1b60db6d2fbbedf1eb60245a10a"} Mar 11 12:44:04 crc kubenswrapper[4816]: I0311 12:44:04.394933 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:04 crc kubenswrapper[4816]: I0311 12:44:04.486459 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") pod \"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\" (UID: \"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7\") " Mar 11 12:44:04 crc kubenswrapper[4816]: I0311 12:44:04.495241 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9" (OuterVolumeSpecName: "kube-api-access-wl6k9") pod "a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" (UID: "a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7"). InnerVolumeSpecName "kube-api-access-wl6k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:44:04 crc kubenswrapper[4816]: I0311 12:44:04.588282 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl6k9\" (UniqueName: \"kubernetes.io/projected/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7-kube-api-access-wl6k9\") on node \"crc\" DevicePath \"\"" Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.092213 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" event={"ID":"a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7","Type":"ContainerDied","Data":"e98daec571cc58b471b8d4238f5502acfec915e41235f877dcf0710ca15f8da9"} Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.092289 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e98daec571cc58b471b8d4238f5502acfec915e41235f877dcf0710ca15f8da9" Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.092361 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553884-t7cv9" Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.479293 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:44:05 crc kubenswrapper[4816]: I0311 12:44:05.489956 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553878-cp29z"] Mar 11 12:44:06 crc kubenswrapper[4816]: I0311 12:44:06.142877 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080a6096-67c3-41de-97d6-29a3f80c027e" path="/var/lib/kubelet/pods/080a6096-67c3-41de-97d6-29a3f80c027e/volumes" Mar 11 12:44:09 crc kubenswrapper[4816]: I0311 12:44:09.514726 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:44:09 crc kubenswrapper[4816]: I0311 12:44:09.515207 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:44:39 crc kubenswrapper[4816]: I0311 12:44:39.515130 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:44:39 crc kubenswrapper[4816]: I0311 12:44:39.515849 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:44:48 crc kubenswrapper[4816]: I0311 12:44:48.158968 4816 scope.go:117] "RemoveContainer" containerID="1e58ff6aa0fcfd993b576ab14b5750187bf29f39db71f93177979cba96a1d350" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.147837 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9"] Mar 11 12:45:00 crc kubenswrapper[4816]: E0311 12:45:00.151030 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" containerName="oc" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.151140 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" containerName="oc" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.151449 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" containerName="oc" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.152172 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.157106 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9"] Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.157449 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.157656 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.339325 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.339708 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.339797 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.441609 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.441663 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.441735 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.443207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.456851 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.459042 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") pod \"collect-profiles-29553885-k5rs9\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.474055 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:00 crc kubenswrapper[4816]: I0311 12:45:00.678309 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9"] Mar 11 12:45:01 crc kubenswrapper[4816]: I0311 12:45:01.582671 4816 generic.go:334] "Generic (PLEG): container finished" podID="b052f4a1-1fdd-4c5e-ba2d-0806387d6058" containerID="c9818458ccc0df3dcfe2c0eba17ad47fa9cb27149aa729ec7d3b86ff29db078f" exitCode=0 Mar 11 12:45:01 crc kubenswrapper[4816]: I0311 12:45:01.582848 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" event={"ID":"b052f4a1-1fdd-4c5e-ba2d-0806387d6058","Type":"ContainerDied","Data":"c9818458ccc0df3dcfe2c0eba17ad47fa9cb27149aa729ec7d3b86ff29db078f"} Mar 11 12:45:01 crc kubenswrapper[4816]: I0311 12:45:01.582897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" 
event={"ID":"b052f4a1-1fdd-4c5e-ba2d-0806387d6058","Type":"ContainerStarted","Data":"2c1af77f22809780fc23335dd2c2c3a4aeb0504ebb4cd7f236c74c8f7fa7f060"} Mar 11 12:45:02 crc kubenswrapper[4816]: I0311 12:45:02.883244 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.084194 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") pod \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.084425 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") pod \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.084601 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") pod \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\" (UID: \"b052f4a1-1fdd-4c5e-ba2d-0806387d6058\") " Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.086305 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume" (OuterVolumeSpecName: "config-volume") pod "b052f4a1-1fdd-4c5e-ba2d-0806387d6058" (UID: "b052f4a1-1fdd-4c5e-ba2d-0806387d6058"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.091563 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b052f4a1-1fdd-4c5e-ba2d-0806387d6058" (UID: "b052f4a1-1fdd-4c5e-ba2d-0806387d6058"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.091583 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt" (OuterVolumeSpecName: "kube-api-access-th9lt") pod "b052f4a1-1fdd-4c5e-ba2d-0806387d6058" (UID: "b052f4a1-1fdd-4c5e-ba2d-0806387d6058"). InnerVolumeSpecName "kube-api-access-th9lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.187842 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.187887 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.187908 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th9lt\" (UniqueName: \"kubernetes.io/projected/b052f4a1-1fdd-4c5e-ba2d-0806387d6058-kube-api-access-th9lt\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.603083 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" 
event={"ID":"b052f4a1-1fdd-4c5e-ba2d-0806387d6058","Type":"ContainerDied","Data":"2c1af77f22809780fc23335dd2c2c3a4aeb0504ebb4cd7f236c74c8f7fa7f060"} Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.603588 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c1af77f22809780fc23335dd2c2c3a4aeb0504ebb4cd7f236c74c8f7fa7f060" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.603187 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553885-k5rs9" Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.959621 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"] Mar 11 12:45:03 crc kubenswrapper[4816]: I0311 12:45:03.964532 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553840-xpb52"] Mar 11 12:45:04 crc kubenswrapper[4816]: I0311 12:45:04.144792 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c040a86-9614-48cb-9df7-14c83b046dce" path="/var/lib/kubelet/pods/3c040a86-9614-48cb-9df7-14c83b046dce/volumes" Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.515208 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.515648 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:45:09 crc 
kubenswrapper[4816]: I0311 12:45:09.515705 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.516307 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.516389 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211" gracePeriod=600 Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.663729 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211" exitCode=0 Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.664172 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211"} Mar 11 12:45:09 crc kubenswrapper[4816]: I0311 12:45:09.664226 4816 scope.go:117] "RemoveContainer" containerID="ec795cfa431d85ddfdf7e69a0e14b960b29a72798d4d8c2c1a02737857cdd1ad" Mar 11 12:45:10 crc kubenswrapper[4816]: I0311 12:45:10.674213 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"} Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.813888 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:12 crc kubenswrapper[4816]: E0311 12:45:12.814864 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b052f4a1-1fdd-4c5e-ba2d-0806387d6058" containerName="collect-profiles" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.814883 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="b052f4a1-1fdd-4c5e-ba2d-0806387d6058" containerName="collect-profiles" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.815094 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="b052f4a1-1fdd-4c5e-ba2d-0806387d6058" containerName="collect-profiles" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.816210 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.825587 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.852332 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.852419 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.852481 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.954901 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.955062 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.955090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.955661 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.955933 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:12 crc kubenswrapper[4816]: I0311 12:45:12.983294 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") pod \"community-operators-lbw7s\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.149838 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.459564 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.704065 4816 generic.go:334] "Generic (PLEG): container finished" podID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerID="cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2" exitCode=0 Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.704160 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerDied","Data":"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2"} Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.704512 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerStarted","Data":"77560a17df15c39d5c9ac619eeb26bfb46708b2b82cc785833190cee1e64b1cb"} Mar 11 12:45:13 crc kubenswrapper[4816]: I0311 12:45:13.706304 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:45:14 crc kubenswrapper[4816]: I0311 12:45:14.718554 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerStarted","Data":"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f"} Mar 11 12:45:15 crc kubenswrapper[4816]: I0311 12:45:15.729239 4816 generic.go:334] "Generic (PLEG): container finished" podID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerID="798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f" exitCode=0 Mar 11 12:45:15 crc kubenswrapper[4816]: I0311 12:45:15.729376 4816 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerDied","Data":"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f"} Mar 11 12:45:16 crc kubenswrapper[4816]: I0311 12:45:16.745865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerStarted","Data":"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620"} Mar 11 12:45:16 crc kubenswrapper[4816]: I0311 12:45:16.770578 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lbw7s" podStartSLOduration=2.178939252 podStartE2EDuration="4.770551701s" podCreationTimestamp="2026-03-11 12:45:12 +0000 UTC" firstStartedPulling="2026-03-11 12:45:13.706007308 +0000 UTC m=+2800.297271275" lastFinishedPulling="2026-03-11 12:45:16.297619707 +0000 UTC m=+2802.888883724" observedRunningTime="2026-03-11 12:45:16.765546798 +0000 UTC m=+2803.356810805" watchObservedRunningTime="2026-03-11 12:45:16.770551701 +0000 UTC m=+2803.361815678" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 12:45:23.150794 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 12:45:23.151633 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 12:45:23.209545 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 12:45:23.886736 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:23 crc kubenswrapper[4816]: I0311 
12:45:23.979536 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:25 crc kubenswrapper[4816]: I0311 12:45:25.836144 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lbw7s" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="registry-server" containerID="cri-o://598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" gracePeriod=2 Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.290766 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.389690 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") pod \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.389829 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") pod \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.389882 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") pod \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\" (UID: \"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff\") " Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.390920 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities" (OuterVolumeSpecName: 
"utilities") pod "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" (UID: "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.398554 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs" (OuterVolumeSpecName: "kube-api-access-9nprs") pod "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" (UID: "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff"). InnerVolumeSpecName "kube-api-access-9nprs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.466371 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" (UID: "ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.491657 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.491698 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.491716 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nprs\" (UniqueName: \"kubernetes.io/projected/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff-kube-api-access-9nprs\") on node \"crc\" DevicePath \"\"" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.851510 4816 generic.go:334] "Generic (PLEG): container finished" podID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerID="598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" exitCode=0 Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.851565 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerDied","Data":"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620"} Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.851609 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lbw7s" event={"ID":"ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff","Type":"ContainerDied","Data":"77560a17df15c39d5c9ac619eeb26bfb46708b2b82cc785833190cee1e64b1cb"} Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.851633 4816 scope.go:117] "RemoveContainer" containerID="598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 
12:45:26.851631 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lbw7s" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.873949 4816 scope.go:117] "RemoveContainer" containerID="798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.892301 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.899098 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lbw7s"] Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.906923 4816 scope.go:117] "RemoveContainer" containerID="cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.942328 4816 scope.go:117] "RemoveContainer" containerID="598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" Mar 11 12:45:26 crc kubenswrapper[4816]: E0311 12:45:26.942894 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620\": container with ID starting with 598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620 not found: ID does not exist" containerID="598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.942943 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620"} err="failed to get container status \"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620\": rpc error: code = NotFound desc = could not find container \"598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620\": container with ID starting with 
598fcc6fc72c2e1642ebf254197361aa5fd99c2cb40b199afa6c0d2b53561620 not found: ID does not exist" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.942976 4816 scope.go:117] "RemoveContainer" containerID="798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f" Mar 11 12:45:26 crc kubenswrapper[4816]: E0311 12:45:26.943312 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f\": container with ID starting with 798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f not found: ID does not exist" containerID="798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.943348 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f"} err="failed to get container status \"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f\": rpc error: code = NotFound desc = could not find container \"798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f\": container with ID starting with 798a71db4dec171296cbd4c600d9d77fe39747501f2ac21333eaf9a918397d8f not found: ID does not exist" Mar 11 12:45:26 crc kubenswrapper[4816]: I0311 12:45:26.943365 4816 scope.go:117] "RemoveContainer" containerID="cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2" Mar 11 12:45:26 crc kubenswrapper[4816]: E0311 12:45:26.943631 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2\": container with ID starting with cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2 not found: ID does not exist" containerID="cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2" Mar 11 12:45:26 crc 
kubenswrapper[4816]: I0311 12:45:26.943661 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2"} err="failed to get container status \"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2\": rpc error: code = NotFound desc = could not find container \"cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2\": container with ID starting with cec52d6249bd59cc5a2ddeff07252de962b58d52a1caf5d7b7a503e8bb8c59d2 not found: ID does not exist" Mar 11 12:45:28 crc kubenswrapper[4816]: I0311 12:45:28.144588 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" path="/var/lib/kubelet/pods/ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff/volumes" Mar 11 12:45:48 crc kubenswrapper[4816]: I0311 12:45:48.247427 4816 scope.go:117] "RemoveContainer" containerID="f3bda5d4e49a815a926b2f32c60f3932a76a7181a017078bc20f79926bfbf6a6" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.153836 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"] Mar 11 12:46:00 crc kubenswrapper[4816]: E0311 12:46:00.155521 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="extract-utilities" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.155539 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="extract-utilities" Mar 11 12:46:00 crc kubenswrapper[4816]: E0311 12:46:00.155554 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="extract-content" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.155564 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="extract-content" Mar 11 12:46:00 crc 
kubenswrapper[4816]: E0311 12:46:00.155576 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="registry-server" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.155582 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="registry-server" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.155734 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed19e2d2-40f6-4cc4-a35b-3d557fbd7aff" containerName="registry-server" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.156266 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.158438 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.160775 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.163323 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"] Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.164066 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.305947 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") pod \"auto-csr-approver-29553886-xjtpg\" (UID: \"10793112-fba0-46e4-a3a5-201255a72221\") " pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.409182 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") pod \"auto-csr-approver-29553886-xjtpg\" (UID: \"10793112-fba0-46e4-a3a5-201255a72221\") " pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.444076 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") pod \"auto-csr-approver-29553886-xjtpg\" (UID: \"10793112-fba0-46e4-a3a5-201255a72221\") " pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.481617 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:00 crc kubenswrapper[4816]: I0311 12:46:00.735780 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"] Mar 11 12:46:01 crc kubenswrapper[4816]: I0311 12:46:01.155684 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" event={"ID":"10793112-fba0-46e4-a3a5-201255a72221","Type":"ContainerStarted","Data":"9a1ea2f5c8b9c9d2168bc35813243f641fb2fd4e1b93f51a50c3ed629bd728cd"} Mar 11 12:46:02 crc kubenswrapper[4816]: I0311 12:46:02.170046 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" event={"ID":"10793112-fba0-46e4-a3a5-201255a72221","Type":"ContainerStarted","Data":"015da5072b60f8b74ef45c6695076cb6f089d3139f01fe3ef7f7d86d8236f381"} Mar 11 12:46:02 crc kubenswrapper[4816]: I0311 12:46:02.193202 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" 
podStartSLOduration=1.2703352749999999 podStartE2EDuration="2.193172663s" podCreationTimestamp="2026-03-11 12:46:00 +0000 UTC" firstStartedPulling="2026-03-11 12:46:00.744325721 +0000 UTC m=+2847.335589688" lastFinishedPulling="2026-03-11 12:46:01.667163109 +0000 UTC m=+2848.258427076" observedRunningTime="2026-03-11 12:46:02.186568114 +0000 UTC m=+2848.777832101" watchObservedRunningTime="2026-03-11 12:46:02.193172663 +0000 UTC m=+2848.784436650" Mar 11 12:46:03 crc kubenswrapper[4816]: I0311 12:46:03.182040 4816 generic.go:334] "Generic (PLEG): container finished" podID="10793112-fba0-46e4-a3a5-201255a72221" containerID="015da5072b60f8b74ef45c6695076cb6f089d3139f01fe3ef7f7d86d8236f381" exitCode=0 Mar 11 12:46:03 crc kubenswrapper[4816]: I0311 12:46:03.182128 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" event={"ID":"10793112-fba0-46e4-a3a5-201255a72221","Type":"ContainerDied","Data":"015da5072b60f8b74ef45c6695076cb6f089d3139f01fe3ef7f7d86d8236f381"} Mar 11 12:46:04 crc kubenswrapper[4816]: I0311 12:46:04.524708 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:04 crc kubenswrapper[4816]: I0311 12:46:04.683641 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") pod \"10793112-fba0-46e4-a3a5-201255a72221\" (UID: \"10793112-fba0-46e4-a3a5-201255a72221\") " Mar 11 12:46:04 crc kubenswrapper[4816]: I0311 12:46:04.693793 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444" (OuterVolumeSpecName: "kube-api-access-sx444") pod "10793112-fba0-46e4-a3a5-201255a72221" (UID: "10793112-fba0-46e4-a3a5-201255a72221"). 
InnerVolumeSpecName "kube-api-access-sx444". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:46:04 crc kubenswrapper[4816]: I0311 12:46:04.785478 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx444\" (UniqueName: \"kubernetes.io/projected/10793112-fba0-46e4-a3a5-201255a72221-kube-api-access-sx444\") on node \"crc\" DevicePath \"\"" Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.202163 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" event={"ID":"10793112-fba0-46e4-a3a5-201255a72221","Type":"ContainerDied","Data":"9a1ea2f5c8b9c9d2168bc35813243f641fb2fd4e1b93f51a50c3ed629bd728cd"} Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.202215 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a1ea2f5c8b9c9d2168bc35813243f641fb2fd4e1b93f51a50c3ed629bd728cd" Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.202332 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553886-xjtpg" Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.275112 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:46:05 crc kubenswrapper[4816]: I0311 12:46:05.282387 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553880-9ks6n"] Mar 11 12:46:06 crc kubenswrapper[4816]: I0311 12:46:06.143686 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e47f911d-5bf4-4923-a6ac-95e98217fd25" path="/var/lib/kubelet/pods/e47f911d-5bf4-4923-a6ac-95e98217fd25/volumes" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.476941 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:35 crc kubenswrapper[4816]: E0311 12:46:35.477875 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10793112-fba0-46e4-a3a5-201255a72221" containerName="oc" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.477887 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="10793112-fba0-46e4-a3a5-201255a72221" containerName="oc" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.478090 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="10793112-fba0-46e4-a3a5-201255a72221" containerName="oc" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.479126 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.498492 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.564221 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.564322 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.564382 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.665320 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.665418 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.665470 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.666371 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.666427 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.689808 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") pod \"redhat-marketplace-4zzg5\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:35 crc kubenswrapper[4816]: I0311 12:46:35.797435 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:36 crc kubenswrapper[4816]: I0311 12:46:36.241929 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:36 crc kubenswrapper[4816]: I0311 12:46:36.454258 4816 generic.go:334] "Generic (PLEG): container finished" podID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerID="f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b" exitCode=0 Mar 11 12:46:36 crc kubenswrapper[4816]: I0311 12:46:36.454534 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerDied","Data":"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b"} Mar 11 12:46:36 crc kubenswrapper[4816]: I0311 12:46:36.454575 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerStarted","Data":"d12981d070039e7ac78396e563a248bc2a041f1a7cbd0cdc1c1468d7a68e045e"} Mar 11 12:46:38 crc kubenswrapper[4816]: I0311 12:46:38.473069 4816 generic.go:334] "Generic (PLEG): container finished" podID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerID="45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6" exitCode=0 Mar 11 12:46:38 crc kubenswrapper[4816]: I0311 12:46:38.473133 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerDied","Data":"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6"} Mar 11 12:46:39 crc kubenswrapper[4816]: I0311 12:46:39.483849 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" 
event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerStarted","Data":"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37"} Mar 11 12:46:39 crc kubenswrapper[4816]: I0311 12:46:39.506896 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4zzg5" podStartSLOduration=1.973589633 podStartE2EDuration="4.506861611s" podCreationTimestamp="2026-03-11 12:46:35 +0000 UTC" firstStartedPulling="2026-03-11 12:46:36.47103144 +0000 UTC m=+2883.062295397" lastFinishedPulling="2026-03-11 12:46:39.004303408 +0000 UTC m=+2885.595567375" observedRunningTime="2026-03-11 12:46:39.500676774 +0000 UTC m=+2886.091940741" watchObservedRunningTime="2026-03-11 12:46:39.506861611 +0000 UTC m=+2886.098125578" Mar 11 12:46:45 crc kubenswrapper[4816]: I0311 12:46:45.798463 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:45 crc kubenswrapper[4816]: I0311 12:46:45.799144 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:45 crc kubenswrapper[4816]: I0311 12:46:45.841087 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:46 crc kubenswrapper[4816]: I0311 12:46:46.610574 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:46 crc kubenswrapper[4816]: I0311 12:46:46.670628 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:48 crc kubenswrapper[4816]: I0311 12:46:48.322748 4816 scope.go:117] "RemoveContainer" containerID="f2c8244acc6c0aed95c31a23f8089006b34d5b7db0dcdad32b1e6365dd4fd124" Mar 11 12:46:48 crc kubenswrapper[4816]: I0311 12:46:48.557390 4816 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-4zzg5" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="registry-server" containerID="cri-o://e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" gracePeriod=2 Mar 11 12:46:48 crc kubenswrapper[4816]: I0311 12:46:48.956367 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.064809 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") pod \"0d271092-56ec-48d0-91cc-8aab4b87d282\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.064933 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") pod \"0d271092-56ec-48d0-91cc-8aab4b87d282\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.065076 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") pod \"0d271092-56ec-48d0-91cc-8aab4b87d282\" (UID: \"0d271092-56ec-48d0-91cc-8aab4b87d282\") " Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.066213 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities" (OuterVolumeSpecName: "utilities") pod "0d271092-56ec-48d0-91cc-8aab4b87d282" (UID: "0d271092-56ec-48d0-91cc-8aab4b87d282"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.075705 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf" (OuterVolumeSpecName: "kube-api-access-pjbkf") pod "0d271092-56ec-48d0-91cc-8aab4b87d282" (UID: "0d271092-56ec-48d0-91cc-8aab4b87d282"). InnerVolumeSpecName "kube-api-access-pjbkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.093014 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d271092-56ec-48d0-91cc-8aab4b87d282" (UID: "0d271092-56ec-48d0-91cc-8aab4b87d282"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.166837 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjbkf\" (UniqueName: \"kubernetes.io/projected/0d271092-56ec-48d0-91cc-8aab4b87d282-kube-api-access-pjbkf\") on node \"crc\" DevicePath \"\"" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.167186 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.167274 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d271092-56ec-48d0-91cc-8aab4b87d282-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.566975 4816 generic.go:334] "Generic (PLEG): container finished" podID="0d271092-56ec-48d0-91cc-8aab4b87d282" 
containerID="e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" exitCode=0 Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.567039 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerDied","Data":"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37"} Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.567072 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zzg5" event={"ID":"0d271092-56ec-48d0-91cc-8aab4b87d282","Type":"ContainerDied","Data":"d12981d070039e7ac78396e563a248bc2a041f1a7cbd0cdc1c1468d7a68e045e"} Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.567095 4816 scope.go:117] "RemoveContainer" containerID="e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.567299 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zzg5" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.585290 4816 scope.go:117] "RemoveContainer" containerID="45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.598951 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.606127 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zzg5"] Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.626067 4816 scope.go:117] "RemoveContainer" containerID="f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.647530 4816 scope.go:117] "RemoveContainer" containerID="e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" Mar 11 12:46:49 crc kubenswrapper[4816]: E0311 12:46:49.648053 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37\": container with ID starting with e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37 not found: ID does not exist" containerID="e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648082 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37"} err="failed to get container status \"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37\": rpc error: code = NotFound desc = could not find container \"e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37\": container with ID starting with e6418086a14d782aacd990a6d4c75501a1835fefb3809c2d1c6ed5f684d82f37 not found: 
ID does not exist" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648118 4816 scope.go:117] "RemoveContainer" containerID="45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6" Mar 11 12:46:49 crc kubenswrapper[4816]: E0311 12:46:49.648530 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6\": container with ID starting with 45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6 not found: ID does not exist" containerID="45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648547 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6"} err="failed to get container status \"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6\": rpc error: code = NotFound desc = could not find container \"45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6\": container with ID starting with 45ccac66208f152eef9f7de56486330f3b49767793c8f3e6f8573af0a43235b6 not found: ID does not exist" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648560 4816 scope.go:117] "RemoveContainer" containerID="f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b" Mar 11 12:46:49 crc kubenswrapper[4816]: E0311 12:46:49.648821 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b\": container with ID starting with f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b not found: ID does not exist" containerID="f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b" Mar 11 12:46:49 crc kubenswrapper[4816]: I0311 12:46:49.648839 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b"} err="failed to get container status \"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b\": rpc error: code = NotFound desc = could not find container \"f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b\": container with ID starting with f6b2d977c9a8f96c182b5aea52589da6a3ff9423af84bdd7258a6894677a528b not found: ID does not exist" Mar 11 12:46:50 crc kubenswrapper[4816]: I0311 12:46:50.146503 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" path="/var/lib/kubelet/pods/0d271092-56ec-48d0-91cc-8aab4b87d282/volumes" Mar 11 12:47:09 crc kubenswrapper[4816]: I0311 12:47:09.515582 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:47:09 crc kubenswrapper[4816]: I0311 12:47:09.517621 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:47:39 crc kubenswrapper[4816]: I0311 12:47:39.514717 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:47:39 crc kubenswrapper[4816]: I0311 12:47:39.515382 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.153235 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"] Mar 11 12:48:00 crc kubenswrapper[4816]: E0311 12:48:00.154223 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="extract-content" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.154270 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="extract-content" Mar 11 12:48:00 crc kubenswrapper[4816]: E0311 12:48:00.154305 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="extract-utilities" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.154318 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="extract-utilities" Mar 11 12:48:00 crc kubenswrapper[4816]: E0311 12:48:00.154358 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="registry-server" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.154372 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="registry-server" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.154582 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d271092-56ec-48d0-91cc-8aab4b87d282" containerName="registry-server" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.155379 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.160858 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.160995 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.161147 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.161995 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"] Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.289401 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") pod \"auto-csr-approver-29553888-7sw8h\" (UID: \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\") " pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.391540 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") pod \"auto-csr-approver-29553888-7sw8h\" (UID: \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\") " pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.410589 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") pod \"auto-csr-approver-29553888-7sw8h\" (UID: \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\") " 
pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.484572 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:00 crc kubenswrapper[4816]: I0311 12:48:00.954395 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"] Mar 11 12:48:01 crc kubenswrapper[4816]: I0311 12:48:01.225508 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" event={"ID":"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177","Type":"ContainerStarted","Data":"37e212f532f9285ddc8c1a39daa606e585f77369c6470ebdc810b0fab9c9bb7c"} Mar 11 12:48:03 crc kubenswrapper[4816]: I0311 12:48:03.246901 4816 generic.go:334] "Generic (PLEG): container finished" podID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" containerID="9ed1a5e43552ff0476bd301f6f56de7c0e4f936f582bd894ea6e5569ba2db74d" exitCode=0 Mar 11 12:48:03 crc kubenswrapper[4816]: I0311 12:48:03.246996 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" event={"ID":"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177","Type":"ContainerDied","Data":"9ed1a5e43552ff0476bd301f6f56de7c0e4f936f582bd894ea6e5569ba2db74d"} Mar 11 12:48:04 crc kubenswrapper[4816]: I0311 12:48:04.635336 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:04 crc kubenswrapper[4816]: I0311 12:48:04.698809 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") pod \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\" (UID: \"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177\") " Mar 11 12:48:04 crc kubenswrapper[4816]: I0311 12:48:04.707870 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn" (OuterVolumeSpecName: "kube-api-access-h7fcn") pod "f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" (UID: "f3a5e0fe-c52b-4b6f-ab13-ba73fce64177"). InnerVolumeSpecName "kube-api-access-h7fcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:48:04 crc kubenswrapper[4816]: I0311 12:48:04.801230 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7fcn\" (UniqueName: \"kubernetes.io/projected/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177-kube-api-access-h7fcn\") on node \"crc\" DevicePath \"\"" Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.272746 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.272686 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553888-7sw8h" event={"ID":"f3a5e0fe-c52b-4b6f-ab13-ba73fce64177","Type":"ContainerDied","Data":"37e212f532f9285ddc8c1a39daa606e585f77369c6470ebdc810b0fab9c9bb7c"} Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.273031 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37e212f532f9285ddc8c1a39daa606e585f77369c6470ebdc810b0fab9c9bb7c" Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.728740 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:48:05 crc kubenswrapper[4816]: I0311 12:48:05.737078 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553882-ggqkk"] Mar 11 12:48:06 crc kubenswrapper[4816]: I0311 12:48:06.140727 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a88e2af-5e7d-4491-a32f-75a670aed689" path="/var/lib/kubelet/pods/5a88e2af-5e7d-4491-a32f-75a670aed689/volumes" Mar 11 12:48:09 crc kubenswrapper[4816]: I0311 12:48:09.515895 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:48:09 crc kubenswrapper[4816]: I0311 12:48:09.516587 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:48:09 crc 
kubenswrapper[4816]: I0311 12:48:09.516656 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:48:09 crc kubenswrapper[4816]: I0311 12:48:09.517567 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:48:09 crc kubenswrapper[4816]: I0311 12:48:09.517649 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" gracePeriod=600 Mar 11 12:48:09 crc kubenswrapper[4816]: E0311 12:48:09.643213 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:10 crc kubenswrapper[4816]: I0311 12:48:10.331642 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" exitCode=0 Mar 11 12:48:10 crc kubenswrapper[4816]: I0311 12:48:10.331723 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"} Mar 11 12:48:10 crc kubenswrapper[4816]: I0311 12:48:10.331814 4816 scope.go:117] "RemoveContainer" containerID="e94d54c6dd2b7a4e577e03c8b08cf5eb1a8a362732b731a0d82ddf5cdc9d6211" Mar 11 12:48:10 crc kubenswrapper[4816]: I0311 12:48:10.332724 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:10 crc kubenswrapper[4816]: E0311 12:48:10.333297 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:21 crc kubenswrapper[4816]: I0311 12:48:21.131174 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:21 crc kubenswrapper[4816]: E0311 12:48:21.132349 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:32 crc kubenswrapper[4816]: I0311 12:48:32.131224 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:32 crc kubenswrapper[4816]: E0311 12:48:32.132290 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:44 crc kubenswrapper[4816]: I0311 12:48:44.143330 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:44 crc kubenswrapper[4816]: E0311 12:48:44.144369 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.650354 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-82vz9"] Mar 11 12:48:46 crc kubenswrapper[4816]: E0311 12:48:46.650969 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" containerName="oc" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.651006 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" containerName="oc" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.651400 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" containerName="oc" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.654655 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.667583 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82vz9"] Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.833084 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-utilities\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.833188 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-catalog-content\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.833219 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgptm\" (UniqueName: \"kubernetes.io/projected/2b81c3bf-499d-48bd-869b-671fefa1ba81-kube-api-access-dgptm\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.934497 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-catalog-content\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.934877 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dgptm\" (UniqueName: \"kubernetes.io/projected/2b81c3bf-499d-48bd-869b-671fefa1ba81-kube-api-access-dgptm\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.934965 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-utilities\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.935916 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-catalog-content\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.935972 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b81c3bf-499d-48bd-869b-671fefa1ba81-utilities\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.962649 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgptm\" (UniqueName: \"kubernetes.io/projected/2b81c3bf-499d-48bd-869b-671fefa1ba81-kube-api-access-dgptm\") pod \"certified-operators-82vz9\" (UID: \"2b81c3bf-499d-48bd-869b-671fefa1ba81\") " pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:46 crc kubenswrapper[4816]: I0311 12:48:46.986186 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:47 crc kubenswrapper[4816]: I0311 12:48:47.280354 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82vz9"] Mar 11 12:48:47 crc kubenswrapper[4816]: I0311 12:48:47.664901 4816 generic.go:334] "Generic (PLEG): container finished" podID="2b81c3bf-499d-48bd-869b-671fefa1ba81" containerID="ed8a3257cbd0ffba17d5ca43261f294047c9f3f8158d95c91ccd21053221242c" exitCode=0 Mar 11 12:48:47 crc kubenswrapper[4816]: I0311 12:48:47.664974 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82vz9" event={"ID":"2b81c3bf-499d-48bd-869b-671fefa1ba81","Type":"ContainerDied","Data":"ed8a3257cbd0ffba17d5ca43261f294047c9f3f8158d95c91ccd21053221242c"} Mar 11 12:48:47 crc kubenswrapper[4816]: I0311 12:48:47.665053 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82vz9" event={"ID":"2b81c3bf-499d-48bd-869b-671fefa1ba81","Type":"ContainerStarted","Data":"60a4e46c53d3c2b51ed6586a5baf5e215ae1a345c51ceb946a1a2ab500544aa7"} Mar 11 12:48:48 crc kubenswrapper[4816]: I0311 12:48:48.424662 4816 scope.go:117] "RemoveContainer" containerID="c588ba0a9276d85151be0b86106d7b0f7a77bf5bc78e6ea0213f1a19b8ad671f" Mar 11 12:48:52 crc kubenswrapper[4816]: I0311 12:48:52.704469 4816 generic.go:334] "Generic (PLEG): container finished" podID="2b81c3bf-499d-48bd-869b-671fefa1ba81" containerID="607956a270a77817bcccd9307e1598f9d1114a7f70ea08d76b9c9c5dbadf188e" exitCode=0 Mar 11 12:48:52 crc kubenswrapper[4816]: I0311 12:48:52.704604 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82vz9" event={"ID":"2b81c3bf-499d-48bd-869b-671fefa1ba81","Type":"ContainerDied","Data":"607956a270a77817bcccd9307e1598f9d1114a7f70ea08d76b9c9c5dbadf188e"} Mar 11 12:48:53 crc kubenswrapper[4816]: I0311 12:48:53.718170 4816 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82vz9" event={"ID":"2b81c3bf-499d-48bd-869b-671fefa1ba81","Type":"ContainerStarted","Data":"487c087a8e1036366e59815f08b96a73c3dd59e78d6a2028fce0c092499692a1"} Mar 11 12:48:53 crc kubenswrapper[4816]: I0311 12:48:53.750477 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-82vz9" podStartSLOduration=2.222062042 podStartE2EDuration="7.750448805s" podCreationTimestamp="2026-03-11 12:48:46 +0000 UTC" firstStartedPulling="2026-03-11 12:48:47.66645674 +0000 UTC m=+3014.257720707" lastFinishedPulling="2026-03-11 12:48:53.194843463 +0000 UTC m=+3019.786107470" observedRunningTime="2026-03-11 12:48:53.744089483 +0000 UTC m=+3020.335353470" watchObservedRunningTime="2026-03-11 12:48:53.750448805 +0000 UTC m=+3020.341712772" Mar 11 12:48:56 crc kubenswrapper[4816]: I0311 12:48:56.987197 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:56 crc kubenswrapper[4816]: I0311 12:48:56.987683 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:57 crc kubenswrapper[4816]: I0311 12:48:57.057812 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:48:58 crc kubenswrapper[4816]: I0311 12:48:58.131882 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:48:58 crc kubenswrapper[4816]: E0311 12:48:58.132983 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.026928 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-82vz9" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.103538 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82vz9"] Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.204168 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.204518 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wlx2d" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="registry-server" containerID="cri-o://034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" gracePeriod=2 Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.591597 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.774435 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") pod \"d456b988-0480-49fc-9667-03c56b871abe\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.774615 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") pod \"d456b988-0480-49fc-9667-03c56b871abe\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.774723 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") pod \"d456b988-0480-49fc-9667-03c56b871abe\" (UID: \"d456b988-0480-49fc-9667-03c56b871abe\") " Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.774947 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities" (OuterVolumeSpecName: "utilities") pod "d456b988-0480-49fc-9667-03c56b871abe" (UID: "d456b988-0480-49fc-9667-03c56b871abe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.775103 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.781985 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf" (OuterVolumeSpecName: "kube-api-access-kfctf") pod "d456b988-0480-49fc-9667-03c56b871abe" (UID: "d456b988-0480-49fc-9667-03c56b871abe"). InnerVolumeSpecName "kube-api-access-kfctf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.830627 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d456b988-0480-49fc-9667-03c56b871abe" (UID: "d456b988-0480-49fc-9667-03c56b871abe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842005 4816 generic.go:334] "Generic (PLEG): container finished" podID="d456b988-0480-49fc-9667-03c56b871abe" containerID="034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" exitCode=0 Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842041 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerDied","Data":"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f"} Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842095 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlx2d" event={"ID":"d456b988-0480-49fc-9667-03c56b871abe","Type":"ContainerDied","Data":"e45e20ab411467d564ca8cae5d6389cdfcd6bd45b4eceaf7b963fe5e1fca9258"} Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842100 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlx2d" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.842121 4816 scope.go:117] "RemoveContainer" containerID="034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.867506 4816 scope.go:117] "RemoveContainer" containerID="6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.876355 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfctf\" (UniqueName: \"kubernetes.io/projected/d456b988-0480-49fc-9667-03c56b871abe-kube-api-access-kfctf\") on node \"crc\" DevicePath \"\"" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.876398 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d456b988-0480-49fc-9667-03c56b871abe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.880364 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.887072 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wlx2d"] Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.913692 4816 scope.go:117] "RemoveContainer" containerID="a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.933203 4816 scope.go:117] "RemoveContainer" containerID="034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" Mar 11 12:49:07 crc kubenswrapper[4816]: E0311 12:49:07.933791 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f\": container with ID starting with 
034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f not found: ID does not exist" containerID="034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.933828 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f"} err="failed to get container status \"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f\": rpc error: code = NotFound desc = could not find container \"034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f\": container with ID starting with 034af007a8b56505d39ecedb921ade4c270666769f440b8289cd57067df8758f not found: ID does not exist" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.933860 4816 scope.go:117] "RemoveContainer" containerID="6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60" Mar 11 12:49:07 crc kubenswrapper[4816]: E0311 12:49:07.934058 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60\": container with ID starting with 6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60 not found: ID does not exist" containerID="6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.934081 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60"} err="failed to get container status \"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60\": rpc error: code = NotFound desc = could not find container \"6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60\": container with ID starting with 6a05710b8cde6897fca0c639bd670ffbfb46d506d87933dfb04ba42f513fcb60 not found: ID does not 
exist" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.934096 4816 scope.go:117] "RemoveContainer" containerID="a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf" Mar 11 12:49:07 crc kubenswrapper[4816]: E0311 12:49:07.934305 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf\": container with ID starting with a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf not found: ID does not exist" containerID="a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf" Mar 11 12:49:07 crc kubenswrapper[4816]: I0311 12:49:07.934333 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf"} err="failed to get container status \"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf\": rpc error: code = NotFound desc = could not find container \"a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf\": container with ID starting with a3fc9fd4ff3edc6c5e4c1aac00b48ef8939da87b568fc529a3373bc8dbd6d7bf not found: ID does not exist" Mar 11 12:49:08 crc kubenswrapper[4816]: I0311 12:49:08.139284 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d456b988-0480-49fc-9667-03c56b871abe" path="/var/lib/kubelet/pods/d456b988-0480-49fc-9667-03c56b871abe/volumes" Mar 11 12:49:12 crc kubenswrapper[4816]: I0311 12:49:12.130835 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:49:12 crc kubenswrapper[4816]: E0311 12:49:12.131851 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:49:24 crc kubenswrapper[4816]: I0311 12:49:24.130827 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:49:24 crc kubenswrapper[4816]: E0311 12:49:24.131585 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:49:37 crc kubenswrapper[4816]: I0311 12:49:37.131684 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:49:37 crc kubenswrapper[4816]: E0311 12:49:37.133402 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:49:51 crc kubenswrapper[4816]: I0311 12:49:51.131764 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:49:51 crc kubenswrapper[4816]: E0311 12:49:51.132997 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.158515 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:50:00 crc kubenswrapper[4816]: E0311 12:50:00.159517 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="extract-utilities" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.159540 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="extract-utilities" Mar 11 12:50:00 crc kubenswrapper[4816]: E0311 12:50:00.159591 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="extract-content" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.159603 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="extract-content" Mar 11 12:50:00 crc kubenswrapper[4816]: E0311 12:50:00.159630 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="registry-server" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.159641 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="registry-server" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.159834 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d456b988-0480-49fc-9667-03c56b871abe" containerName="registry-server" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.160492 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.163471 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.163587 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.163700 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.167097 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.268060 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") pod \"auto-csr-approver-29553890-4jjhs\" (UID: \"3ba83d28-3266-48ec-a66b-256e07e427c4\") " pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.368885 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") pod \"auto-csr-approver-29553890-4jjhs\" (UID: \"3ba83d28-3266-48ec-a66b-256e07e427c4\") " pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.390109 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") pod \"auto-csr-approver-29553890-4jjhs\" (UID: \"3ba83d28-3266-48ec-a66b-256e07e427c4\") " 
pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.483290 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:00 crc kubenswrapper[4816]: I0311 12:50:00.905355 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:50:01 crc kubenswrapper[4816]: I0311 12:50:01.303774 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" event={"ID":"3ba83d28-3266-48ec-a66b-256e07e427c4","Type":"ContainerStarted","Data":"276f6ef42121ac14bfdd12c54816b043fcc5d41dc9444b9b53c34e8ce6a3b4d0"} Mar 11 12:50:03 crc kubenswrapper[4816]: I0311 12:50:03.321791 4816 generic.go:334] "Generic (PLEG): container finished" podID="3ba83d28-3266-48ec-a66b-256e07e427c4" containerID="24d57408e6d94ff8c7de8f3b9883efb12d44570ac11551e03845bb25056d71b0" exitCode=0 Mar 11 12:50:03 crc kubenswrapper[4816]: I0311 12:50:03.321908 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" event={"ID":"3ba83d28-3266-48ec-a66b-256e07e427c4","Type":"ContainerDied","Data":"24d57408e6d94ff8c7de8f3b9883efb12d44570ac11551e03845bb25056d71b0"} Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.136391 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:50:04 crc kubenswrapper[4816]: E0311 12:50:04.137075 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" 
Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.647441 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.838272 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") pod \"3ba83d28-3266-48ec-a66b-256e07e427c4\" (UID: \"3ba83d28-3266-48ec-a66b-256e07e427c4\") " Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.848499 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb" (OuterVolumeSpecName: "kube-api-access-47jtb") pod "3ba83d28-3266-48ec-a66b-256e07e427c4" (UID: "3ba83d28-3266-48ec-a66b-256e07e427c4"). InnerVolumeSpecName "kube-api-access-47jtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:50:04 crc kubenswrapper[4816]: I0311 12:50:04.940273 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47jtb\" (UniqueName: \"kubernetes.io/projected/3ba83d28-3266-48ec-a66b-256e07e427c4-kube-api-access-47jtb\") on node \"crc\" DevicePath \"\"" Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.342836 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" event={"ID":"3ba83d28-3266-48ec-a66b-256e07e427c4","Type":"ContainerDied","Data":"276f6ef42121ac14bfdd12c54816b043fcc5d41dc9444b9b53c34e8ce6a3b4d0"} Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.342894 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="276f6ef42121ac14bfdd12c54816b043fcc5d41dc9444b9b53c34e8ce6a3b4d0" Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.342910 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553890-4jjhs" Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.727358 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:50:05 crc kubenswrapper[4816]: I0311 12:50:05.736756 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553884-t7cv9"] Mar 11 12:50:06 crc kubenswrapper[4816]: I0311 12:50:06.141054 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7" path="/var/lib/kubelet/pods/a8c87e3f-c7ea-4b63-bcbc-dee0d07ed0f7/volumes" Mar 11 12:50:18 crc kubenswrapper[4816]: I0311 12:50:18.130670 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:50:18 crc kubenswrapper[4816]: E0311 12:50:18.131637 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:50:29 crc kubenswrapper[4816]: I0311 12:50:29.131219 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:50:29 crc kubenswrapper[4816]: E0311 12:50:29.132130 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:50:40 crc kubenswrapper[4816]: I0311 12:50:40.132150 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:50:40 crc kubenswrapper[4816]: E0311 12:50:40.133022 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:50:48 crc kubenswrapper[4816]: I0311 12:50:48.524286 4816 scope.go:117] "RemoveContainer" containerID="da680962b6fbbd0e75bc32153dd7114d5c7dd1b60db6d2fbbedf1eb60245a10a"
Mar 11 12:50:53 crc kubenswrapper[4816]: I0311 12:50:53.132054 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:50:53 crc kubenswrapper[4816]: E0311 12:50:53.133233 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:51:04 crc kubenswrapper[4816]: I0311 12:51:04.135230 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:51:04 crc kubenswrapper[4816]: E0311 12:51:04.136174 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:51:17 crc kubenswrapper[4816]: I0311 12:51:17.131796 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:51:17 crc kubenswrapper[4816]: E0311 12:51:17.133627 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:51:29 crc kubenswrapper[4816]: I0311 12:51:29.130712 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:51:29 crc kubenswrapper[4816]: E0311 12:51:29.132101 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:51:42 crc kubenswrapper[4816]: I0311 12:51:42.130317 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:51:42 crc kubenswrapper[4816]: E0311 12:51:42.132946 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:51:57 crc kubenswrapper[4816]: I0311 12:51:57.131598 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:51:57 crc kubenswrapper[4816]: E0311 12:51:57.132654 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.184065 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"]
Mar 11 12:52:00 crc kubenswrapper[4816]: E0311 12:52:00.184923 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba83d28-3266-48ec-a66b-256e07e427c4" containerName="oc"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.184939 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba83d28-3266-48ec-a66b-256e07e427c4" containerName="oc"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.185170 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba83d28-3266-48ec-a66b-256e07e427c4" containerName="oc"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.185808 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553892-p8pvb"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.193774 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.194075 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.194165 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.196613 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"]
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.296867 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") pod \"auto-csr-approver-29553892-p8pvb\" (UID: \"0930f466-9688-4eee-a82d-54a22a037535\") " pod="openshift-infra/auto-csr-approver-29553892-p8pvb"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.399046 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") pod \"auto-csr-approver-29553892-p8pvb\" (UID: \"0930f466-9688-4eee-a82d-54a22a037535\") " pod="openshift-infra/auto-csr-approver-29553892-p8pvb"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.426071 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") pod \"auto-csr-approver-29553892-p8pvb\" (UID: \"0930f466-9688-4eee-a82d-54a22a037535\") " pod="openshift-infra/auto-csr-approver-29553892-p8pvb"
Mar 11 12:52:00 crc kubenswrapper[4816]: I0311 12:52:00.519949 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553892-p8pvb"
Mar 11 12:52:01 crc kubenswrapper[4816]: I0311 12:52:01.020648 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"]
Mar 11 12:52:01 crc kubenswrapper[4816]: I0311 12:52:01.027934 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 12:52:01 crc kubenswrapper[4816]: I0311 12:52:01.889515 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" event={"ID":"0930f466-9688-4eee-a82d-54a22a037535","Type":"ContainerStarted","Data":"0e5e84f0f22d19d1a1cc75bdc126647c2800a3edd6da5c83b52ecd79fd137b37"}
Mar 11 12:52:02 crc kubenswrapper[4816]: I0311 12:52:02.902105 4816 generic.go:334] "Generic (PLEG): container finished" podID="0930f466-9688-4eee-a82d-54a22a037535" containerID="b23f59795fc03fe5ae5f308d14da26d3250f022f9dd89c94c78eb50bf14fca19" exitCode=0
Mar 11 12:52:02 crc kubenswrapper[4816]: I0311 12:52:02.902186 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" event={"ID":"0930f466-9688-4eee-a82d-54a22a037535","Type":"ContainerDied","Data":"b23f59795fc03fe5ae5f308d14da26d3250f022f9dd89c94c78eb50bf14fca19"}
Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.322878 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553892-p8pvb"
Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.475965 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") pod \"0930f466-9688-4eee-a82d-54a22a037535\" (UID: \"0930f466-9688-4eee-a82d-54a22a037535\") "
Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.483769 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh" (OuterVolumeSpecName: "kube-api-access-nxbbh") pod "0930f466-9688-4eee-a82d-54a22a037535" (UID: "0930f466-9688-4eee-a82d-54a22a037535"). InnerVolumeSpecName "kube-api-access-nxbbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.577862 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxbbh\" (UniqueName: \"kubernetes.io/projected/0930f466-9688-4eee-a82d-54a22a037535-kube-api-access-nxbbh\") on node \"crc\" DevicePath \"\""
Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.919679 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553892-p8pvb" event={"ID":"0930f466-9688-4eee-a82d-54a22a037535","Type":"ContainerDied","Data":"0e5e84f0f22d19d1a1cc75bdc126647c2800a3edd6da5c83b52ecd79fd137b37"}
Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.919772 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e5e84f0f22d19d1a1cc75bdc126647c2800a3edd6da5c83b52ecd79fd137b37"
Mar 11 12:52:04 crc kubenswrapper[4816]: I0311 12:52:04.919804 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553892-p8pvb"
Mar 11 12:52:05 crc kubenswrapper[4816]: I0311 12:52:05.388483 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"]
Mar 11 12:52:05 crc kubenswrapper[4816]: I0311 12:52:05.394063 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553886-xjtpg"]
Mar 11 12:52:06 crc kubenswrapper[4816]: I0311 12:52:06.140805 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10793112-fba0-46e4-a3a5-201255a72221" path="/var/lib/kubelet/pods/10793112-fba0-46e4-a3a5-201255a72221/volumes"
Mar 11 12:52:12 crc kubenswrapper[4816]: I0311 12:52:12.130899 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:52:12 crc kubenswrapper[4816]: E0311 12:52:12.131470 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:52:26 crc kubenswrapper[4816]: I0311 12:52:26.131227 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:52:26 crc kubenswrapper[4816]: E0311 12:52:26.132898 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:52:38 crc kubenswrapper[4816]: I0311 12:52:38.130637 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:52:38 crc kubenswrapper[4816]: E0311 12:52:38.131764 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:52:48 crc kubenswrapper[4816]: I0311 12:52:48.638376 4816 scope.go:117] "RemoveContainer" containerID="015da5072b60f8b74ef45c6695076cb6f089d3139f01fe3ef7f7d86d8236f381"
Mar 11 12:52:51 crc kubenswrapper[4816]: I0311 12:52:51.130571 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:52:51 crc kubenswrapper[4816]: E0311 12:52:51.131280 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:53:04 crc kubenswrapper[4816]: I0311 12:53:04.135135 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:53:04 crc kubenswrapper[4816]: E0311 12:53:04.136233 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 12:53:16 crc kubenswrapper[4816]: I0311 12:53:16.130858 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334"
Mar 11 12:53:16 crc kubenswrapper[4816]: I0311 12:53:16.504116 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d"}
Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.781495 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"]
Mar 11 12:53:50 crc kubenswrapper[4816]: E0311 12:53:50.782389 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930f466-9688-4eee-a82d-54a22a037535" containerName="oc"
Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.782407 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930f466-9688-4eee-a82d-54a22a037535" containerName="oc"
Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.782581 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0930f466-9688-4eee-a82d-54a22a037535" containerName="oc"
Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.783707 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.806049 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"]
Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.934171 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.934232 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:50 crc kubenswrapper[4816]: I0311 12:53:50.934306 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.037149 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.037342 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.037375 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.037993 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.038198 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.070270 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") pod \"redhat-operators-47gxf\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") " pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.127690 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.591781 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"]
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.810992 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerStarted","Data":"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2"}
Mar 11 12:53:51 crc kubenswrapper[4816]: I0311 12:53:51.811044 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerStarted","Data":"418b4f749c0e0bfdb4875242a4075c3d65786f340cf1d80b61c4a416d7f3f702"}
Mar 11 12:53:52 crc kubenswrapper[4816]: I0311 12:53:52.818458 4816 generic.go:334] "Generic (PLEG): container finished" podID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerID="3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2" exitCode=0
Mar 11 12:53:52 crc kubenswrapper[4816]: I0311 12:53:52.818572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerDied","Data":"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2"}
Mar 11 12:53:53 crc kubenswrapper[4816]: I0311 12:53:53.829054 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerStarted","Data":"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607"}
Mar 11 12:53:54 crc kubenswrapper[4816]: I0311 12:53:54.839231 4816 generic.go:334] "Generic (PLEG): container finished" podID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerID="0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607" exitCode=0
Mar 11 12:53:54 crc kubenswrapper[4816]: I0311 12:53:54.839359 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerDied","Data":"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607"}
Mar 11 12:53:55 crc kubenswrapper[4816]: I0311 12:53:55.855731 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerStarted","Data":"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"}
Mar 11 12:53:55 crc kubenswrapper[4816]: I0311 12:53:55.882564 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-47gxf" podStartSLOduration=3.200894044 podStartE2EDuration="5.882531668s" podCreationTimestamp="2026-03-11 12:53:50 +0000 UTC" firstStartedPulling="2026-03-11 12:53:52.820261794 +0000 UTC m=+3319.411525761" lastFinishedPulling="2026-03-11 12:53:55.501899418 +0000 UTC m=+3322.093163385" observedRunningTime="2026-03-11 12:53:55.874229816 +0000 UTC m=+3322.465493783" watchObservedRunningTime="2026-03-11 12:53:55.882531668 +0000 UTC m=+3322.473795635"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.154682 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"]
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.156534 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553894-48d29"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.159462 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.159581 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.159880 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.168419 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"]
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.290938 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") pod \"auto-csr-approver-29553894-48d29\" (UID: \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\") " pod="openshift-infra/auto-csr-approver-29553894-48d29"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.392783 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") pod \"auto-csr-approver-29553894-48d29\" (UID: \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\") " pod="openshift-infra/auto-csr-approver-29553894-48d29"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.413348 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") pod \"auto-csr-approver-29553894-48d29\" (UID: \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\") " pod="openshift-infra/auto-csr-approver-29553894-48d29"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.476418 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553894-48d29"
Mar 11 12:54:00 crc kubenswrapper[4816]: I0311 12:54:00.928224 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"]
Mar 11 12:54:00 crc kubenswrapper[4816]: W0311 12:54:00.938078 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca38c432_abfd_4e1c_8ea7_a0781390bb1d.slice/crio-2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25 WatchSource:0}: Error finding container 2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25: Status 404 returned error can't find the container with id 2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25
Mar 11 12:54:01 crc kubenswrapper[4816]: I0311 12:54:01.128672 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:54:01 crc kubenswrapper[4816]: I0311 12:54:01.128733 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:54:01 crc kubenswrapper[4816]: I0311 12:54:01.906051 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553894-48d29" event={"ID":"ca38c432-abfd-4e1c-8ea7-a0781390bb1d","Type":"ContainerStarted","Data":"2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25"}
Mar 11 12:54:02 crc kubenswrapper[4816]: I0311 12:54:02.175868 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-47gxf" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" probeResult="failure" output=<
Mar 11 12:54:02 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s
Mar 11 12:54:02 crc kubenswrapper[4816]: >
Mar 11 12:54:02 crc kubenswrapper[4816]: I0311 12:54:02.917851 4816 generic.go:334] "Generic (PLEG): container finished" podID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" containerID="f4e7fa686a33e8ded3e5e43526bdf4db23b8a91e490e0b84842982390eec6764" exitCode=0
Mar 11 12:54:02 crc kubenswrapper[4816]: I0311 12:54:02.918101 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553894-48d29" event={"ID":"ca38c432-abfd-4e1c-8ea7-a0781390bb1d","Type":"ContainerDied","Data":"f4e7fa686a33e8ded3e5e43526bdf4db23b8a91e490e0b84842982390eec6764"}
Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.264755 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553894-48d29"
Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.372337 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") pod \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\" (UID: \"ca38c432-abfd-4e1c-8ea7-a0781390bb1d\") "
Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.378713 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc" (OuterVolumeSpecName: "kube-api-access-dglvc") pod "ca38c432-abfd-4e1c-8ea7-a0781390bb1d" (UID: "ca38c432-abfd-4e1c-8ea7-a0781390bb1d"). InnerVolumeSpecName "kube-api-access-dglvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.475142 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dglvc\" (UniqueName: \"kubernetes.io/projected/ca38c432-abfd-4e1c-8ea7-a0781390bb1d-kube-api-access-dglvc\") on node \"crc\" DevicePath \"\""
Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.940136 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553894-48d29" event={"ID":"ca38c432-abfd-4e1c-8ea7-a0781390bb1d","Type":"ContainerDied","Data":"2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25"}
Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.940270 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d455ef0cff5c60692ee00dc059c3e95a0c6daec4f3b1e7c64274d0cd5d2df25"
Mar 11 12:54:04 crc kubenswrapper[4816]: I0311 12:54:04.940319 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553894-48d29"
Mar 11 12:54:05 crc kubenswrapper[4816]: I0311 12:54:05.352939 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"]
Mar 11 12:54:05 crc kubenswrapper[4816]: I0311 12:54:05.359024 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553888-7sw8h"]
Mar 11 12:54:06 crc kubenswrapper[4816]: I0311 12:54:06.147402 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a5e0fe-c52b-4b6f-ab13-ba73fce64177" path="/var/lib/kubelet/pods/f3a5e0fe-c52b-4b6f-ab13-ba73fce64177/volumes"
Mar 11 12:54:11 crc kubenswrapper[4816]: I0311 12:54:11.178680 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:54:11 crc kubenswrapper[4816]: I0311 12:54:11.225784 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:54:11 crc kubenswrapper[4816]: I0311 12:54:11.415547 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"]
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.011966 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-47gxf" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" containerID="cri-o://14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554" gracePeriod=2
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.457063 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.544484 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") pod \"46b8010d-316f-425c-9e3a-69f771cc81a5\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") "
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.544632 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") pod \"46b8010d-316f-425c-9e3a-69f771cc81a5\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") "
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.544724 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") pod \"46b8010d-316f-425c-9e3a-69f771cc81a5\" (UID: \"46b8010d-316f-425c-9e3a-69f771cc81a5\") "
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.545864 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities" (OuterVolumeSpecName: "utilities") pod "46b8010d-316f-425c-9e3a-69f771cc81a5" (UID: "46b8010d-316f-425c-9e3a-69f771cc81a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.552986 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v" (OuterVolumeSpecName: "kube-api-access-dll6v") pod "46b8010d-316f-425c-9e3a-69f771cc81a5" (UID: "46b8010d-316f-425c-9e3a-69f771cc81a5"). InnerVolumeSpecName "kube-api-access-dll6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.646745 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.646790 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dll6v\" (UniqueName: \"kubernetes.io/projected/46b8010d-316f-425c-9e3a-69f771cc81a5-kube-api-access-dll6v\") on node \"crc\" DevicePath \"\""
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.699426 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46b8010d-316f-425c-9e3a-69f771cc81a5" (UID: "46b8010d-316f-425c-9e3a-69f771cc81a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 12:54:13 crc kubenswrapper[4816]: I0311 12:54:13.748678 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b8010d-316f-425c-9e3a-69f771cc81a5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.028062 4816 generic.go:334] "Generic (PLEG): container finished" podID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerID="14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554" exitCode=0
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.028131 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerDied","Data":"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"}
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.028213 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47gxf" event={"ID":"46b8010d-316f-425c-9e3a-69f771cc81a5","Type":"ContainerDied","Data":"418b4f749c0e0bfdb4875242a4075c3d65786f340cf1d80b61c4a416d7f3f702"}
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.028261 4816 scope.go:117] "RemoveContainer" containerID="14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.029726 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47gxf"
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.065603 4816 scope.go:117] "RemoveContainer" containerID="0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607"
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.077329 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"]
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.086888 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-47gxf"]
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.088923 4816 scope.go:117] "RemoveContainer" containerID="3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2"
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.109736 4816 scope.go:117] "RemoveContainer" containerID="14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"
Mar 11 12:54:14 crc kubenswrapper[4816]: E0311 12:54:14.110870 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554\": container with ID starting with 14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554 not found: ID does not exist" containerID="14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"
Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.110927 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554"} err="failed to get container status \"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554\": rpc error: code = NotFound desc = could not find container \"14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554\": container with ID starting with 14ab2555c6f27d768806e6f256aaa29847dcea835676309926e23e0478d97554 not found: ID does 
not exist" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.110968 4816 scope.go:117] "RemoveContainer" containerID="0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607" Mar 11 12:54:14 crc kubenswrapper[4816]: E0311 12:54:14.111486 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607\": container with ID starting with 0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607 not found: ID does not exist" containerID="0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.111508 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607"} err="failed to get container status \"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607\": rpc error: code = NotFound desc = could not find container \"0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607\": container with ID starting with 0b9be8b59dbf94de7f07db8b17733685c0fb81071870240f778dda4ca448f607 not found: ID does not exist" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.111523 4816 scope.go:117] "RemoveContainer" containerID="3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2" Mar 11 12:54:14 crc kubenswrapper[4816]: E0311 12:54:14.111760 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2\": container with ID starting with 3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2 not found: ID does not exist" containerID="3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.111785 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2"} err="failed to get container status \"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2\": rpc error: code = NotFound desc = could not find container \"3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2\": container with ID starting with 3b189802d6a873408cb8ad3686ac7d0ab5de12010d3d6f7a4a4fbb106d1193a2 not found: ID does not exist" Mar 11 12:54:14 crc kubenswrapper[4816]: I0311 12:54:14.141124 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" path="/var/lib/kubelet/pods/46b8010d-316f-425c-9e3a-69f771cc81a5/volumes" Mar 11 12:54:48 crc kubenswrapper[4816]: I0311 12:54:48.722613 4816 scope.go:117] "RemoveContainer" containerID="9ed1a5e43552ff0476bd301f6f56de7c0e4f936f582bd894ea6e5569ba2db74d" Mar 11 12:55:39 crc kubenswrapper[4816]: I0311 12:55:39.515392 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:55:39 crc kubenswrapper[4816]: I0311 12:55:39.516165 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.153735 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"] Mar 11 12:56:00 crc kubenswrapper[4816]: E0311 12:56:00.154965 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="extract-content" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.154995 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="extract-content" Mar 11 12:56:00 crc kubenswrapper[4816]: E0311 12:56:00.155014 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" containerName="oc" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155028 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" containerName="oc" Mar 11 12:56:00 crc kubenswrapper[4816]: E0311 12:56:00.155066 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155081 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" Mar 11 12:56:00 crc kubenswrapper[4816]: E0311 12:56:00.155114 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="extract-utilities" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155126 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="extract-utilities" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155421 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" containerName="oc" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.155463 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b8010d-316f-425c-9e3a-69f771cc81a5" containerName="registry-server" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.156341 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.162038 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"] Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.173966 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.178865 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.179138 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.293907 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") pod \"auto-csr-approver-29553896-lxt87\" (UID: \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\") " pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.395087 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") pod \"auto-csr-approver-29553896-lxt87\" (UID: \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\") " pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.422185 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") pod \"auto-csr-approver-29553896-lxt87\" (UID: \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\") " 
pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.486802 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.902090 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"] Mar 11 12:56:00 crc kubenswrapper[4816]: I0311 12:56:00.954554 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553896-lxt87" event={"ID":"2438ebe2-3bab-42fc-9430-8b2600a2efd1","Type":"ContainerStarted","Data":"e5a7905530685f7effb3274c90abdf63d08adb1d15899784b209bb6b6b52b86a"} Mar 11 12:56:02 crc kubenswrapper[4816]: I0311 12:56:02.979540 4816 generic.go:334] "Generic (PLEG): container finished" podID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" containerID="78e7b5f4d85a6eb5009a8b4bbf6ee9389d9e1a205f3bf787bb243af2af671b70" exitCode=0 Mar 11 12:56:02 crc kubenswrapper[4816]: I0311 12:56:02.980383 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553896-lxt87" event={"ID":"2438ebe2-3bab-42fc-9430-8b2600a2efd1","Type":"ContainerDied","Data":"78e7b5f4d85a6eb5009a8b4bbf6ee9389d9e1a205f3bf787bb243af2af671b70"} Mar 11 12:56:04 crc kubenswrapper[4816]: I0311 12:56:04.371404 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:04 crc kubenswrapper[4816]: I0311 12:56:04.471146 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") pod \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\" (UID: \"2438ebe2-3bab-42fc-9430-8b2600a2efd1\") " Mar 11 12:56:04 crc kubenswrapper[4816]: I0311 12:56:04.477716 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh" (OuterVolumeSpecName: "kube-api-access-shlvh") pod "2438ebe2-3bab-42fc-9430-8b2600a2efd1" (UID: "2438ebe2-3bab-42fc-9430-8b2600a2efd1"). InnerVolumeSpecName "kube-api-access-shlvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:56:04 crc kubenswrapper[4816]: I0311 12:56:04.573110 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shlvh\" (UniqueName: \"kubernetes.io/projected/2438ebe2-3bab-42fc-9430-8b2600a2efd1-kube-api-access-shlvh\") on node \"crc\" DevicePath \"\"" Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.002104 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553896-lxt87" event={"ID":"2438ebe2-3bab-42fc-9430-8b2600a2efd1","Type":"ContainerDied","Data":"e5a7905530685f7effb3274c90abdf63d08adb1d15899784b209bb6b6b52b86a"} Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.002187 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553896-lxt87" Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.002205 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5a7905530685f7effb3274c90abdf63d08adb1d15899784b209bb6b6b52b86a" Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.452961 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:56:05 crc kubenswrapper[4816]: I0311 12:56:05.453043 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553890-4jjhs"] Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.145345 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba83d28-3266-48ec-a66b-256e07e427c4" path="/var/lib/kubelet/pods/3ba83d28-3266-48ec-a66b-256e07e427c4/volumes" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.553922 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:06 crc kubenswrapper[4816]: E0311 12:56:06.554375 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" containerName="oc" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.554392 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" containerName="oc" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.554587 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" containerName="oc" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.555814 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.573374 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.705214 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.705314 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.705576 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.807480 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.807591 4816 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.807668 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.808207 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.808261 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.833312 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") pod \"community-operators-gwmbx\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:06 crc kubenswrapper[4816]: I0311 12:56:06.879288 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:07 crc kubenswrapper[4816]: I0311 12:56:07.216075 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:07 crc kubenswrapper[4816]: E0311 12:56:07.650017 4816 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48071a6c_027a_4069_8c66_49fe9309a163.slice/crio-conmon-54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3.scope\": RecentStats: unable to find data in memory cache]" Mar 11 12:56:08 crc kubenswrapper[4816]: I0311 12:56:08.030310 4816 generic.go:334] "Generic (PLEG): container finished" podID="48071a6c-027a-4069-8c66-49fe9309a163" containerID="54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3" exitCode=0 Mar 11 12:56:08 crc kubenswrapper[4816]: I0311 12:56:08.030434 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerDied","Data":"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3"} Mar 11 12:56:08 crc kubenswrapper[4816]: I0311 12:56:08.030872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerStarted","Data":"82a7c6ba67dd3a8cd1f47261f2f32235365ba7093768c2ad5c0634571d84ccf9"} Mar 11 12:56:09 crc kubenswrapper[4816]: I0311 12:56:09.041335 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerStarted","Data":"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f"} Mar 11 12:56:09 crc kubenswrapper[4816]: I0311 12:56:09.514915 4816 patch_prober.go:28] interesting 
pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:56:09 crc kubenswrapper[4816]: I0311 12:56:09.515054 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:56:10 crc kubenswrapper[4816]: I0311 12:56:10.050598 4816 generic.go:334] "Generic (PLEG): container finished" podID="48071a6c-027a-4069-8c66-49fe9309a163" containerID="0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f" exitCode=0 Mar 11 12:56:10 crc kubenswrapper[4816]: I0311 12:56:10.051899 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerDied","Data":"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f"} Mar 11 12:56:11 crc kubenswrapper[4816]: I0311 12:56:11.067541 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerStarted","Data":"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d"} Mar 11 12:56:11 crc kubenswrapper[4816]: I0311 12:56:11.093267 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gwmbx" podStartSLOduration=2.666236365 podStartE2EDuration="5.093219375s" podCreationTimestamp="2026-03-11 12:56:06 +0000 UTC" firstStartedPulling="2026-03-11 12:56:08.031759143 +0000 UTC m=+3454.623023110" lastFinishedPulling="2026-03-11 12:56:10.458742153 +0000 UTC 
m=+3457.050006120" observedRunningTime="2026-03-11 12:56:11.089928133 +0000 UTC m=+3457.681192100" watchObservedRunningTime="2026-03-11 12:56:11.093219375 +0000 UTC m=+3457.684483342" Mar 11 12:56:16 crc kubenswrapper[4816]: I0311 12:56:16.879958 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:16 crc kubenswrapper[4816]: I0311 12:56:16.881141 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:16 crc kubenswrapper[4816]: I0311 12:56:16.937738 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:17 crc kubenswrapper[4816]: I0311 12:56:17.193748 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:17 crc kubenswrapper[4816]: I0311 12:56:17.259351 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:19 crc kubenswrapper[4816]: I0311 12:56:19.139474 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gwmbx" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="registry-server" containerID="cri-o://aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" gracePeriod=2 Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.100099 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.152260 4816 generic.go:334] "Generic (PLEG): container finished" podID="48071a6c-027a-4069-8c66-49fe9309a163" containerID="aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" exitCode=0 Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.152402 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gwmbx" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.155822 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerDied","Data":"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d"} Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.155888 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwmbx" event={"ID":"48071a6c-027a-4069-8c66-49fe9309a163","Type":"ContainerDied","Data":"82a7c6ba67dd3a8cd1f47261f2f32235365ba7093768c2ad5c0634571d84ccf9"} Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.155913 4816 scope.go:117] "RemoveContainer" containerID="aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.174426 4816 scope.go:117] "RemoveContainer" containerID="0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.180117 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") pod \"48071a6c-027a-4069-8c66-49fe9309a163\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.180459 4816 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") pod \"48071a6c-027a-4069-8c66-49fe9309a163\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.180507 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") pod \"48071a6c-027a-4069-8c66-49fe9309a163\" (UID: \"48071a6c-027a-4069-8c66-49fe9309a163\") " Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.181554 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities" (OuterVolumeSpecName: "utilities") pod "48071a6c-027a-4069-8c66-49fe9309a163" (UID: "48071a6c-027a-4069-8c66-49fe9309a163"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.190189 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z" (OuterVolumeSpecName: "kube-api-access-lcm9z") pod "48071a6c-027a-4069-8c66-49fe9309a163" (UID: "48071a6c-027a-4069-8c66-49fe9309a163"). InnerVolumeSpecName "kube-api-access-lcm9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.194607 4816 scope.go:117] "RemoveContainer" containerID="54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.241552 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48071a6c-027a-4069-8c66-49fe9309a163" (UID: "48071a6c-027a-4069-8c66-49fe9309a163"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.245635 4816 scope.go:117] "RemoveContainer" containerID="aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" Mar 11 12:56:20 crc kubenswrapper[4816]: E0311 12:56:20.246172 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d\": container with ID starting with aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d not found: ID does not exist" containerID="aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.246241 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d"} err="failed to get container status \"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d\": rpc error: code = NotFound desc = could not find container \"aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d\": container with ID starting with aa98f0292f08a6bf3d38bb810f0ef7dab461e46b3651e23c7c5e5138e308488d not found: ID does not exist" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.246310 4816 scope.go:117] 
"RemoveContainer" containerID="0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f" Mar 11 12:56:20 crc kubenswrapper[4816]: E0311 12:56:20.246698 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f\": container with ID starting with 0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f not found: ID does not exist" containerID="0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.246778 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f"} err="failed to get container status \"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f\": rpc error: code = NotFound desc = could not find container \"0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f\": container with ID starting with 0170356183f0124b4b7ed6bfdad54e871522af344f7374e05ff5d111c5dc4d1f not found: ID does not exist" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.246813 4816 scope.go:117] "RemoveContainer" containerID="54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3" Mar 11 12:56:20 crc kubenswrapper[4816]: E0311 12:56:20.247093 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3\": container with ID starting with 54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3 not found: ID does not exist" containerID="54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.247124 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3"} err="failed to get container status \"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3\": rpc error: code = NotFound desc = could not find container \"54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3\": container with ID starting with 54f09b00e99cb668ef04e9dae162ba6d3025018a6bafd9ab40af08257c930ec3 not found: ID does not exist" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.283070 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.283120 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcm9z\" (UniqueName: \"kubernetes.io/projected/48071a6c-027a-4069-8c66-49fe9309a163-kube-api-access-lcm9z\") on node \"crc\" DevicePath \"\"" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.283134 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48071a6c-027a-4069-8c66-49fe9309a163-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.496204 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:20 crc kubenswrapper[4816]: I0311 12:56:20.502663 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gwmbx"] Mar 11 12:56:22 crc kubenswrapper[4816]: I0311 12:56:22.144235 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48071a6c-027a-4069-8c66-49fe9309a163" path="/var/lib/kubelet/pods/48071a6c-027a-4069-8c66-49fe9309a163/volumes" Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.515377 4816 patch_prober.go:28] interesting 
pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.516188 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.516314 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.517511 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:56:39 crc kubenswrapper[4816]: I0311 12:56:39.517638 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d" gracePeriod=600 Mar 11 12:56:40 crc kubenswrapper[4816]: I0311 12:56:40.327551 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d" exitCode=0 Mar 11 12:56:40 crc kubenswrapper[4816]: I0311 12:56:40.327638 
4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d"} Mar 11 12:56:40 crc kubenswrapper[4816]: I0311 12:56:40.328107 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"} Mar 11 12:56:40 crc kubenswrapper[4816]: I0311 12:56:40.328145 4816 scope.go:117] "RemoveContainer" containerID="64508c6e69343dd1765463d9117cc772933b01ec1363a188e021731915106334" Mar 11 12:56:48 crc kubenswrapper[4816]: I0311 12:56:48.858324 4816 scope.go:117] "RemoveContainer" containerID="24d57408e6d94ff8c7de8f3b9883efb12d44570ac11551e03845bb25056d71b0" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.163286 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 12:58:00 crc kubenswrapper[4816]: E0311 12:58:00.164592 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="extract-utilities" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.164616 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="extract-utilities" Mar 11 12:58:00 crc kubenswrapper[4816]: E0311 12:58:00.164645 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="extract-content" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.164654 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="extract-content" Mar 11 12:58:00 crc kubenswrapper[4816]: E0311 12:58:00.164669 
4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="registry-server" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.164679 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="registry-server" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.164867 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="48071a6c-027a-4069-8c66-49fe9309a163" containerName="registry-server" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.165617 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.168862 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.169602 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.170972 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.171360 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.273071 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") pod \"auto-csr-approver-29553898-tcpck\" (UID: \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\") " pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.374687 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") pod \"auto-csr-approver-29553898-tcpck\" (UID: \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\") " pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.396032 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") pod \"auto-csr-approver-29553898-tcpck\" (UID: \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\") " pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.486152 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.978733 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 12:58:00 crc kubenswrapper[4816]: I0311 12:58:00.981346 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 12:58:01 crc kubenswrapper[4816]: I0311 12:58:01.023321 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553898-tcpck" event={"ID":"30f75061-6a64-4c1d-b9f9-77f6425ad4c5","Type":"ContainerStarted","Data":"0e4d0465c032f5dec6dcec2a7498ded923a9f68afec618eb9cc057d9458104d0"} Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.456217 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.458259 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.466327 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.617417 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.617543 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.617572 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.719734 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.719850 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.719884 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.720636 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.720678 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.749497 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") pod \"redhat-marketplace-zkv9h\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:02 crc kubenswrapper[4816]: I0311 12:58:02.775422 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:03 crc kubenswrapper[4816]: I0311 12:58:03.045617 4816 generic.go:334] "Generic (PLEG): container finished" podID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" containerID="0c0de876588cbf0205a01555bac817ffc9ad65f6cabe6192282136fce8802326" exitCode=0 Mar 11 12:58:03 crc kubenswrapper[4816]: I0311 12:58:03.045987 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553898-tcpck" event={"ID":"30f75061-6a64-4c1d-b9f9-77f6425ad4c5","Type":"ContainerDied","Data":"0c0de876588cbf0205a01555bac817ffc9ad65f6cabe6192282136fce8802326"} Mar 11 12:58:03 crc kubenswrapper[4816]: I0311 12:58:03.236869 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.058262 4816 generic.go:334] "Generic (PLEG): container finished" podID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerID="2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405" exitCode=0 Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.058361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerDied","Data":"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405"} Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.058455 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerStarted","Data":"8599988ebbf02ada5db5c82da2f9c61c4470d15bbd29823853b790abf1c0ac6e"} Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.405569 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.549305 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") pod \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\" (UID: \"30f75061-6a64-4c1d-b9f9-77f6425ad4c5\") " Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.557153 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk" (OuterVolumeSpecName: "kube-api-access-xc7mk") pod "30f75061-6a64-4c1d-b9f9-77f6425ad4c5" (UID: "30f75061-6a64-4c1d-b9f9-77f6425ad4c5"). InnerVolumeSpecName "kube-api-access-xc7mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:58:04 crc kubenswrapper[4816]: I0311 12:58:04.651676 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7mk\" (UniqueName: \"kubernetes.io/projected/30f75061-6a64-4c1d-b9f9-77f6425ad4c5-kube-api-access-xc7mk\") on node \"crc\" DevicePath \"\"" Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.067571 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerStarted","Data":"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126"} Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.071354 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553898-tcpck" event={"ID":"30f75061-6a64-4c1d-b9f9-77f6425ad4c5","Type":"ContainerDied","Data":"0e4d0465c032f5dec6dcec2a7498ded923a9f68afec618eb9cc057d9458104d0"} Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.071391 4816 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0e4d0465c032f5dec6dcec2a7498ded923a9f68afec618eb9cc057d9458104d0" Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.071441 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553898-tcpck" Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.531952 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"] Mar 11 12:58:05 crc kubenswrapper[4816]: I0311 12:58:05.538203 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553892-p8pvb"] Mar 11 12:58:06 crc kubenswrapper[4816]: I0311 12:58:06.094856 4816 generic.go:334] "Generic (PLEG): container finished" podID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerID="66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126" exitCode=0 Mar 11 12:58:06 crc kubenswrapper[4816]: I0311 12:58:06.095477 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerDied","Data":"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126"} Mar 11 12:58:06 crc kubenswrapper[4816]: I0311 12:58:06.141390 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0930f466-9688-4eee-a82d-54a22a037535" path="/var/lib/kubelet/pods/0930f466-9688-4eee-a82d-54a22a037535/volumes" Mar 11 12:58:07 crc kubenswrapper[4816]: I0311 12:58:07.107024 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerStarted","Data":"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8"} Mar 11 12:58:07 crc kubenswrapper[4816]: I0311 12:58:07.133654 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zkv9h" podStartSLOduration=2.690883193 
podStartE2EDuration="5.133626414s" podCreationTimestamp="2026-03-11 12:58:02 +0000 UTC" firstStartedPulling="2026-03-11 12:58:04.061608568 +0000 UTC m=+3570.652872545" lastFinishedPulling="2026-03-11 12:58:06.504351799 +0000 UTC m=+3573.095615766" observedRunningTime="2026-03-11 12:58:07.126569162 +0000 UTC m=+3573.717833129" watchObservedRunningTime="2026-03-11 12:58:07.133626414 +0000 UTC m=+3573.724890381" Mar 11 12:58:12 crc kubenswrapper[4816]: I0311 12:58:12.776066 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:12 crc kubenswrapper[4816]: I0311 12:58:12.776563 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:12 crc kubenswrapper[4816]: I0311 12:58:12.844558 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:13 crc kubenswrapper[4816]: I0311 12:58:13.223288 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:13 crc kubenswrapper[4816]: I0311 12:58:13.281341 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.191921 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zkv9h" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="registry-server" containerID="cri-o://f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" gracePeriod=2 Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.677444 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.844058 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") pod \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.844222 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") pod \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.844549 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") pod \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\" (UID: \"d2ffaffb-3ae6-4747-b335-142213b1f4b5\") " Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.845945 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities" (OuterVolumeSpecName: "utilities") pod "d2ffaffb-3ae6-4747-b335-142213b1f4b5" (UID: "d2ffaffb-3ae6-4747-b335-142213b1f4b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.852615 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2" (OuterVolumeSpecName: "kube-api-access-2j7k2") pod "d2ffaffb-3ae6-4747-b335-142213b1f4b5" (UID: "d2ffaffb-3ae6-4747-b335-142213b1f4b5"). InnerVolumeSpecName "kube-api-access-2j7k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.882976 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2ffaffb-3ae6-4747-b335-142213b1f4b5" (UID: "d2ffaffb-3ae6-4747-b335-142213b1f4b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.946472 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.946505 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ffaffb-3ae6-4747-b335-142213b1f4b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 12:58:15 crc kubenswrapper[4816]: I0311 12:58:15.946517 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7k2\" (UniqueName: \"kubernetes.io/projected/d2ffaffb-3ae6-4747-b335-142213b1f4b5-kube-api-access-2j7k2\") on node \"crc\" DevicePath \"\"" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209443 4816 generic.go:334] "Generic (PLEG): container finished" podID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerID="f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" exitCode=0 Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209552 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerDied","Data":"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8"} Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209632 4816 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-zkv9h" event={"ID":"d2ffaffb-3ae6-4747-b335-142213b1f4b5","Type":"ContainerDied","Data":"8599988ebbf02ada5db5c82da2f9c61c4470d15bbd29823853b790abf1c0ac6e"} Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209677 4816 scope.go:117] "RemoveContainer" containerID="f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.209973 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zkv9h" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.245471 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.251558 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zkv9h"] Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.264818 4816 scope.go:117] "RemoveContainer" containerID="66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.285423 4816 scope.go:117] "RemoveContainer" containerID="2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.315855 4816 scope.go:117] "RemoveContainer" containerID="f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" Mar 11 12:58:16 crc kubenswrapper[4816]: E0311 12:58:16.316390 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8\": container with ID starting with f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8 not found: ID does not exist" containerID="f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.316434 4816 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8"} err="failed to get container status \"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8\": rpc error: code = NotFound desc = could not find container \"f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8\": container with ID starting with f2fc6b35ea2e58e063a0ac329a00dc0849f16be3473ed5b8fe3c6ff4365618b8 not found: ID does not exist" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.316464 4816 scope.go:117] "RemoveContainer" containerID="66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126" Mar 11 12:58:16 crc kubenswrapper[4816]: E0311 12:58:16.316746 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126\": container with ID starting with 66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126 not found: ID does not exist" containerID="66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.316773 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126"} err="failed to get container status \"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126\": rpc error: code = NotFound desc = could not find container \"66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126\": container with ID starting with 66b3dcac7834e6d5440e328ad42960ab25abf89c447dbc321630cf0d3cf95126 not found: ID does not exist" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.316788 4816 scope.go:117] "RemoveContainer" containerID="2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405" Mar 11 12:58:16 crc kubenswrapper[4816]: E0311 
12:58:16.317127 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405\": container with ID starting with 2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405 not found: ID does not exist" containerID="2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405" Mar 11 12:58:16 crc kubenswrapper[4816]: I0311 12:58:16.317163 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405"} err="failed to get container status \"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405\": rpc error: code = NotFound desc = could not find container \"2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405\": container with ID starting with 2cd9cb2aa48fe25fd083e62ab793ddc96846d698efd4d18758135390a84de405 not found: ID does not exist" Mar 11 12:58:18 crc kubenswrapper[4816]: I0311 12:58:18.145705 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" path="/var/lib/kubelet/pods/d2ffaffb-3ae6-4747-b335-142213b1f4b5/volumes" Mar 11 12:58:39 crc kubenswrapper[4816]: I0311 12:58:39.515975 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:58:39 crc kubenswrapper[4816]: I0311 12:58:39.516689 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 11 12:58:49 crc kubenswrapper[4816]: I0311 12:58:49.003800 4816 scope.go:117] "RemoveContainer" containerID="b23f59795fc03fe5ae5f308d14da26d3250f022f9dd89c94c78eb50bf14fca19" Mar 11 12:59:09 crc kubenswrapper[4816]: I0311 12:59:09.515868 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:59:09 crc kubenswrapper[4816]: I0311 12:59:09.516873 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.514784 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.515327 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.515377 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.516031 4816 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 12:59:39 crc kubenswrapper[4816]: I0311 12:59:39.516090 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" gracePeriod=600 Mar 11 12:59:39 crc kubenswrapper[4816]: E0311 12:59:39.641011 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:59:40 crc kubenswrapper[4816]: I0311 12:59:40.009130 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" exitCode=0 Mar 11 12:59:40 crc kubenswrapper[4816]: I0311 12:59:40.009189 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"} Mar 11 12:59:40 crc kubenswrapper[4816]: I0311 12:59:40.009231 4816 scope.go:117] "RemoveContainer" containerID="82969b44556ee232154d78ccdc1672ee0b8f8d60f9110d4d7c57547eaa3f598d" Mar 11 12:59:40 crc 
kubenswrapper[4816]: I0311 12:59:40.009814 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 12:59:40 crc kubenswrapper[4816]: E0311 12:59:40.010043 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 12:59:55 crc kubenswrapper[4816]: I0311 12:59:55.131348 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 12:59:55 crc kubenswrapper[4816]: E0311 12:59:55.134990 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.153664 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:00:00 crc kubenswrapper[4816]: E0311 13:00:00.154685 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" containerName="oc" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154703 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" containerName="oc" Mar 11 13:00:00 crc kubenswrapper[4816]: E0311 13:00:00.154728 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="registry-server" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154737 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="registry-server" Mar 11 13:00:00 crc kubenswrapper[4816]: E0311 13:00:00.154753 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="extract-content" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154759 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="extract-content" Mar 11 13:00:00 crc kubenswrapper[4816]: E0311 13:00:00.154779 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="extract-utilities" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154787 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="extract-utilities" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.154976 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" containerName="oc" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.155014 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ffaffb-3ae6-4747-b335-142213b1f4b5" containerName="registry-server" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.155657 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.158744 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.159435 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.159655 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.163171 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.255379 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq"] Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.256978 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.259556 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.259765 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.264912 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq"] Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.297793 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") pod \"auto-csr-approver-29553900-h76nw\" (UID: \"9d463a78-830d-4b86-830a-e70345993927\") " pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.400020 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.400449 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.400658 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") pod \"auto-csr-approver-29553900-h76nw\" (UID: \"9d463a78-830d-4b86-830a-e70345993927\") " pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.400724 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.422544 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") pod \"auto-csr-approver-29553900-h76nw\" (UID: \"9d463a78-830d-4b86-830a-e70345993927\") " pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.477155 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.502462 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.502529 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.502573 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.503940 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.508919 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.520218 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") pod \"collect-profiles-29553900-v98kq\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.574910 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:00 crc kubenswrapper[4816]: I0311 13:00:00.933601 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.039561 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq"] Mar 11 13:00:01 crc kubenswrapper[4816]: W0311 13:00:01.040740 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf461fc9a_2ced_499e_a8a3_ab129c298ea7.slice/crio-486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9 WatchSource:0}: Error finding container 486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9: Status 404 returned error can't find the container with id 486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9 Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.224298 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553900-h76nw" 
event={"ID":"9d463a78-830d-4b86-830a-e70345993927","Type":"ContainerStarted","Data":"5d2ed3071d3084d6bd7e9cb365ae82c2e8219d1bd89320437712935bcd615238"} Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.226948 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" event={"ID":"f461fc9a-2ced-499e-a8a3-ab129c298ea7","Type":"ContainerStarted","Data":"5710219c402e66dc0b9662cdba2a41be288b420909be4715c88b70adba89aff6"} Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.227007 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" event={"ID":"f461fc9a-2ced-499e-a8a3-ab129c298ea7","Type":"ContainerStarted","Data":"486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9"} Mar 11 13:00:01 crc kubenswrapper[4816]: I0311 13:00:01.247490 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" podStartSLOduration=1.247464675 podStartE2EDuration="1.247464675s" podCreationTimestamp="2026-03-11 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 13:00:01.242981507 +0000 UTC m=+3687.834245474" watchObservedRunningTime="2026-03-11 13:00:01.247464675 +0000 UTC m=+3687.838728642" Mar 11 13:00:02 crc kubenswrapper[4816]: I0311 13:00:02.237466 4816 generic.go:334] "Generic (PLEG): container finished" podID="f461fc9a-2ced-499e-a8a3-ab129c298ea7" containerID="5710219c402e66dc0b9662cdba2a41be288b420909be4715c88b70adba89aff6" exitCode=0 Mar 11 13:00:02 crc kubenswrapper[4816]: I0311 13:00:02.237529 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" 
event={"ID":"f461fc9a-2ced-499e-a8a3-ab129c298ea7","Type":"ContainerDied","Data":"5710219c402e66dc0b9662cdba2a41be288b420909be4715c88b70adba89aff6"} Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.553933 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.657195 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") pod \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.657365 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") pod \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.657445 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") pod \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\" (UID: \"f461fc9a-2ced-499e-a8a3-ab129c298ea7\") " Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.658458 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume" (OuterVolumeSpecName: "config-volume") pod "f461fc9a-2ced-499e-a8a3-ab129c298ea7" (UID: "f461fc9a-2ced-499e-a8a3-ab129c298ea7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.664448 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d" (OuterVolumeSpecName: "kube-api-access-kk42d") pod "f461fc9a-2ced-499e-a8a3-ab129c298ea7" (UID: "f461fc9a-2ced-499e-a8a3-ab129c298ea7"). InnerVolumeSpecName "kube-api-access-kk42d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.664437 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f461fc9a-2ced-499e-a8a3-ab129c298ea7" (UID: "f461fc9a-2ced-499e-a8a3-ab129c298ea7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.759392 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f461fc9a-2ced-499e-a8a3-ab129c298ea7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.759437 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk42d\" (UniqueName: \"kubernetes.io/projected/f461fc9a-2ced-499e-a8a3-ab129c298ea7-kube-api-access-kk42d\") on node \"crc\" DevicePath \"\"" Mar 11 13:00:03 crc kubenswrapper[4816]: I0311 13:00:03.759450 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f461fc9a-2ced-499e-a8a3-ab129c298ea7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.255642 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" 
event={"ID":"f461fc9a-2ced-499e-a8a3-ab129c298ea7","Type":"ContainerDied","Data":"486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9"} Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.255707 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486aaa9f185ddfc81cdba8d3b57cd9ce1b56ffa613bf5a5e3b8dd6e1a0bf75e9" Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.255714 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553900-v98kq" Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.321324 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 13:00:04 crc kubenswrapper[4816]: I0311 13:00:04.326443 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553855-l4sqr"] Mar 11 13:00:05 crc kubenswrapper[4816]: I0311 13:00:05.265833 4816 generic.go:334] "Generic (PLEG): container finished" podID="9d463a78-830d-4b86-830a-e70345993927" containerID="02f79ceb28719ec9aa00f051068012e5f7850ccf8b02f5d8f4ecbb73c01a94f5" exitCode=0 Mar 11 13:00:05 crc kubenswrapper[4816]: I0311 13:00:05.265886 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553900-h76nw" event={"ID":"9d463a78-830d-4b86-830a-e70345993927","Type":"ContainerDied","Data":"02f79ceb28719ec9aa00f051068012e5f7850ccf8b02f5d8f4ecbb73c01a94f5"} Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.136389 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:06 crc kubenswrapper[4816]: E0311 13:00:06.136640 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.169711 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a876e965-7c6d-4773-9c9b-f445411c559b" path="/var/lib/kubelet/pods/a876e965-7c6d-4773-9c9b-f445411c559b/volumes" Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.527078 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.706932 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") pod \"9d463a78-830d-4b86-830a-e70345993927\" (UID: \"9d463a78-830d-4b86-830a-e70345993927\") " Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.712030 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz" (OuterVolumeSpecName: "kube-api-access-9klbz") pod "9d463a78-830d-4b86-830a-e70345993927" (UID: "9d463a78-830d-4b86-830a-e70345993927"). InnerVolumeSpecName "kube-api-access-9klbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:00:06 crc kubenswrapper[4816]: I0311 13:00:06.809073 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9klbz\" (UniqueName: \"kubernetes.io/projected/9d463a78-830d-4b86-830a-e70345993927-kube-api-access-9klbz\") on node \"crc\" DevicePath \"\"" Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.283603 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553900-h76nw" event={"ID":"9d463a78-830d-4b86-830a-e70345993927","Type":"ContainerDied","Data":"5d2ed3071d3084d6bd7e9cb365ae82c2e8219d1bd89320437712935bcd615238"} Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.283684 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d2ed3071d3084d6bd7e9cb365ae82c2e8219d1bd89320437712935bcd615238" Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.284201 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553900-h76nw" Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.582830 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"] Mar 11 13:00:07 crc kubenswrapper[4816]: I0311 13:00:07.587897 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553894-48d29"] Mar 11 13:00:08 crc kubenswrapper[4816]: I0311 13:00:08.142579 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca38c432-abfd-4e1c-8ea7-a0781390bb1d" path="/var/lib/kubelet/pods/ca38c432-abfd-4e1c-8ea7-a0781390bb1d/volumes" Mar 11 13:00:17 crc kubenswrapper[4816]: I0311 13:00:17.130624 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:17 crc kubenswrapper[4816]: E0311 13:00:17.131390 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:31 crc kubenswrapper[4816]: I0311 13:00:31.132016 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:31 crc kubenswrapper[4816]: E0311 13:00:31.135277 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:44 crc kubenswrapper[4816]: I0311 13:00:44.135119 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:44 crc kubenswrapper[4816]: E0311 13:00:44.136394 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:00:49 crc kubenswrapper[4816]: I0311 13:00:49.121558 4816 scope.go:117] "RemoveContainer" containerID="f55a9848386a64adca827b95cdc172bd623f9f4d2757b50c73cba6bd74ab25e2" Mar 11 13:00:49 crc kubenswrapper[4816]: I0311 13:00:49.151613 4816 scope.go:117] "RemoveContainer" 
containerID="f4e7fa686a33e8ded3e5e43526bdf4db23b8a91e490e0b84842982390eec6764" Mar 11 13:00:59 crc kubenswrapper[4816]: I0311 13:00:59.130919 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:00:59 crc kubenswrapper[4816]: E0311 13:00:59.132151 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:01:14 crc kubenswrapper[4816]: I0311 13:01:14.137799 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:01:14 crc kubenswrapper[4816]: E0311 13:01:14.138660 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:01:29 crc kubenswrapper[4816]: I0311 13:01:29.130228 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:01:29 crc kubenswrapper[4816]: E0311 13:01:29.131060 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:01:44 crc kubenswrapper[4816]: I0311 13:01:44.139808 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:01:44 crc kubenswrapper[4816]: E0311 13:01:44.141031 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:01:56 crc kubenswrapper[4816]: I0311 13:01:56.131382 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:01:56 crc kubenswrapper[4816]: E0311 13:01:56.133933 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.178746 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"] Mar 11 13:02:00 crc kubenswrapper[4816]: E0311 13:02:00.179858 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f461fc9a-2ced-499e-a8a3-ab129c298ea7" containerName="collect-profiles" Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.179883 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f461fc9a-2ced-499e-a8a3-ab129c298ea7" 
containerName="collect-profiles"
Mar 11 13:02:00 crc kubenswrapper[4816]: E0311 13:02:00.179917 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d463a78-830d-4b86-830a-e70345993927" containerName="oc"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.179925 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d463a78-830d-4b86-830a-e70345993927" containerName="oc"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.180100 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f461fc9a-2ced-499e-a8a3-ab129c298ea7" containerName="collect-profiles"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.180117 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d463a78-830d-4b86-830a-e70345993927" containerName="oc"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.180808 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553902-lffcf"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.183948 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.183954 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.185701 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.211081 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"]
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.252268 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") pod \"auto-csr-approver-29553902-lffcf\" (UID: \"6a325766-41a7-415f-88ad-698627f015c1\") " pod="openshift-infra/auto-csr-approver-29553902-lffcf"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.354548 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") pod \"auto-csr-approver-29553902-lffcf\" (UID: \"6a325766-41a7-415f-88ad-698627f015c1\") " pod="openshift-infra/auto-csr-approver-29553902-lffcf"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.374494 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") pod \"auto-csr-approver-29553902-lffcf\" (UID: \"6a325766-41a7-415f-88ad-698627f015c1\") " pod="openshift-infra/auto-csr-approver-29553902-lffcf"
Mar 11 13:02:00 crc kubenswrapper[4816]: I0311 13:02:00.507316 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553902-lffcf"
Mar 11 13:02:01 crc kubenswrapper[4816]: I0311 13:02:01.003347 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"]
Mar 11 13:02:01 crc kubenswrapper[4816]: I0311 13:02:01.525704 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553902-lffcf" event={"ID":"6a325766-41a7-415f-88ad-698627f015c1","Type":"ContainerStarted","Data":"9ea757d1b65436ff49ee171bceb32375f57ad532f35a8fdb81e882f520694379"}
Mar 11 13:02:02 crc kubenswrapper[4816]: I0311 13:02:02.539084 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553902-lffcf" event={"ID":"6a325766-41a7-415f-88ad-698627f015c1","Type":"ContainerStarted","Data":"6b272c0cd2cf4fb57145d2f34bc9f76d7316747da7af06ee61d93b20ed09cce5"}
Mar 11 13:02:03 crc kubenswrapper[4816]: I0311 13:02:03.555456 4816 generic.go:334] "Generic (PLEG): container finished" podID="6a325766-41a7-415f-88ad-698627f015c1" containerID="6b272c0cd2cf4fb57145d2f34bc9f76d7316747da7af06ee61d93b20ed09cce5" exitCode=0
Mar 11 13:02:03 crc kubenswrapper[4816]: I0311 13:02:03.555555 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553902-lffcf" event={"ID":"6a325766-41a7-415f-88ad-698627f015c1","Type":"ContainerDied","Data":"6b272c0cd2cf4fb57145d2f34bc9f76d7316747da7af06ee61d93b20ed09cce5"}
Mar 11 13:02:03 crc kubenswrapper[4816]: I0311 13:02:03.997333 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553902-lffcf"
Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.125486 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") pod \"6a325766-41a7-415f-88ad-698627f015c1\" (UID: \"6a325766-41a7-415f-88ad-698627f015c1\") "
Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.137393 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc" (OuterVolumeSpecName: "kube-api-access-zndsc") pod "6a325766-41a7-415f-88ad-698627f015c1" (UID: "6a325766-41a7-415f-88ad-698627f015c1"). InnerVolumeSpecName "kube-api-access-zndsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.227393 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zndsc\" (UniqueName: \"kubernetes.io/projected/6a325766-41a7-415f-88ad-698627f015c1-kube-api-access-zndsc\") on node \"crc\" DevicePath \"\""
Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.568063 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553902-lffcf" event={"ID":"6a325766-41a7-415f-88ad-698627f015c1","Type":"ContainerDied","Data":"9ea757d1b65436ff49ee171bceb32375f57ad532f35a8fdb81e882f520694379"}
Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.568131 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea757d1b65436ff49ee171bceb32375f57ad532f35a8fdb81e882f520694379"
Mar 11 13:02:04 crc kubenswrapper[4816]: I0311 13:02:04.568132 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553902-lffcf"
Mar 11 13:02:05 crc kubenswrapper[4816]: I0311 13:02:05.088802 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"]
Mar 11 13:02:05 crc kubenswrapper[4816]: I0311 13:02:05.097275 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553896-lxt87"]
Mar 11 13:02:06 crc kubenswrapper[4816]: I0311 13:02:06.141048 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2438ebe2-3bab-42fc-9430-8b2600a2efd1" path="/var/lib/kubelet/pods/2438ebe2-3bab-42fc-9430-8b2600a2efd1/volumes"
Mar 11 13:02:07 crc kubenswrapper[4816]: I0311 13:02:07.130287 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:02:07 crc kubenswrapper[4816]: E0311 13:02:07.130702 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:02:21 crc kubenswrapper[4816]: I0311 13:02:21.131136 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:02:21 crc kubenswrapper[4816]: E0311 13:02:21.132479 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:02:34 crc kubenswrapper[4816]: I0311 13:02:34.135972 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:02:34 crc kubenswrapper[4816]: E0311 13:02:34.138971 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:02:46 crc kubenswrapper[4816]: I0311 13:02:46.131156 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:02:46 crc kubenswrapper[4816]: E0311 13:02:46.132593 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:02:49 crc kubenswrapper[4816]: I0311 13:02:49.302225 4816 scope.go:117] "RemoveContainer" containerID="78e7b5f4d85a6eb5009a8b4bbf6ee9389d9e1a205f3bf787bb243af2af671b70"
Mar 11 13:02:57 crc kubenswrapper[4816]: I0311 13:02:57.137057 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:02:57 crc kubenswrapper[4816]: E0311 13:02:57.137880 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:03:09 crc kubenswrapper[4816]: I0311 13:03:09.130513 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:03:09 crc kubenswrapper[4816]: E0311 13:03:09.131910 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:03:23 crc kubenswrapper[4816]: I0311 13:03:23.130781 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:03:23 crc kubenswrapper[4816]: E0311 13:03:23.132293 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.105572 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"]
Mar 11 13:03:35 crc kubenswrapper[4816]: E0311 13:03:35.106583 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a325766-41a7-415f-88ad-698627f015c1" containerName="oc"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.106596 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a325766-41a7-415f-88ad-698627f015c1" containerName="oc"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.106801 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a325766-41a7-415f-88ad-698627f015c1" containerName="oc"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.108811 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.129909 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"]
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.279567 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.279639 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.279896 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.381750 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.381859 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.381902 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.382509 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.382569 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.403614 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") pod \"certified-operators-hmmvj\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") " pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.452423 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:35 crc kubenswrapper[4816]: I0311 13:03:35.944294 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"]
Mar 11 13:03:36 crc kubenswrapper[4816]: I0311 13:03:36.490744 4816 generic.go:334] "Generic (PLEG): container finished" podID="81dece66-cc39-4d85-b338-fe3626c87bff" containerID="8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43" exitCode=0
Mar 11 13:03:36 crc kubenswrapper[4816]: I0311 13:03:36.490865 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerDied","Data":"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43"}
Mar 11 13:03:36 crc kubenswrapper[4816]: I0311 13:03:36.493336 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerStarted","Data":"9deb2f31d1966f9a8f198c744b69afa0993e2b39aacd3fe0626d5fd630fed132"}
Mar 11 13:03:36 crc kubenswrapper[4816]: I0311 13:03:36.493996 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 13:03:38 crc kubenswrapper[4816]: I0311 13:03:38.131582 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:03:38 crc kubenswrapper[4816]: E0311 13:03:38.132225 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:03:38 crc kubenswrapper[4816]: I0311 13:03:38.525127 4816 generic.go:334] "Generic (PLEG): container finished" podID="81dece66-cc39-4d85-b338-fe3626c87bff" containerID="da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb" exitCode=0
Mar 11 13:03:38 crc kubenswrapper[4816]: I0311 13:03:38.525267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerDied","Data":"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb"}
Mar 11 13:03:39 crc kubenswrapper[4816]: I0311 13:03:39.538663 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerStarted","Data":"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"}
Mar 11 13:03:39 crc kubenswrapper[4816]: I0311 13:03:39.569790 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hmmvj" podStartSLOduration=1.934257251 podStartE2EDuration="4.569766334s" podCreationTimestamp="2026-03-11 13:03:35 +0000 UTC" firstStartedPulling="2026-03-11 13:03:36.493425315 +0000 UTC m=+3903.084689322" lastFinishedPulling="2026-03-11 13:03:39.128934398 +0000 UTC m=+3905.720198405" observedRunningTime="2026-03-11 13:03:39.56821444 +0000 UTC m=+3906.159478447" watchObservedRunningTime="2026-03-11 13:03:39.569766334 +0000 UTC m=+3906.161030321"
Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.453137 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.454449 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.534347 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.673607 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:45 crc kubenswrapper[4816]: I0311 13:03:45.787285 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"]
Mar 11 13:03:47 crc kubenswrapper[4816]: I0311 13:03:47.639455 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hmmvj" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="registry-server" containerID="cri-o://dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972" gracePeriod=2
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.218686 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.236928 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") pod \"81dece66-cc39-4d85-b338-fe3626c87bff\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") "
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.237057 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") pod \"81dece66-cc39-4d85-b338-fe3626c87bff\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") "
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.237223 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") pod \"81dece66-cc39-4d85-b338-fe3626c87bff\" (UID: \"81dece66-cc39-4d85-b338-fe3626c87bff\") "
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.238332 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities" (OuterVolumeSpecName: "utilities") pod "81dece66-cc39-4d85-b338-fe3626c87bff" (UID: "81dece66-cc39-4d85-b338-fe3626c87bff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.251742 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8" (OuterVolumeSpecName: "kube-api-access-vtgx8") pod "81dece66-cc39-4d85-b338-fe3626c87bff" (UID: "81dece66-cc39-4d85-b338-fe3626c87bff"). InnerVolumeSpecName "kube-api-access-vtgx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.339350 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.339388 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtgx8\" (UniqueName: \"kubernetes.io/projected/81dece66-cc39-4d85-b338-fe3626c87bff-kube-api-access-vtgx8\") on node \"crc\" DevicePath \"\""
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.339541 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81dece66-cc39-4d85-b338-fe3626c87bff" (UID: "81dece66-cc39-4d85-b338-fe3626c87bff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.441493 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81dece66-cc39-4d85-b338-fe3626c87bff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654539 4816 generic.go:334] "Generic (PLEG): container finished" podID="81dece66-cc39-4d85-b338-fe3626c87bff" containerID="dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972" exitCode=0
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654634 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerDied","Data":"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"}
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654660 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmmvj"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654694 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmmvj" event={"ID":"81dece66-cc39-4d85-b338-fe3626c87bff","Type":"ContainerDied","Data":"9deb2f31d1966f9a8f198c744b69afa0993e2b39aacd3fe0626d5fd630fed132"}
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.654739 4816 scope.go:117] "RemoveContainer" containerID="dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.702614 4816 scope.go:117] "RemoveContainer" containerID="da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.729028 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"]
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.734961 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hmmvj"]
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.739921 4816 scope.go:117] "RemoveContainer" containerID="8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.767311 4816 scope.go:117] "RemoveContainer" containerID="dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"
Mar 11 13:03:48 crc kubenswrapper[4816]: E0311 13:03:48.768100 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972\": container with ID starting with dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972 not found: ID does not exist" containerID="dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.768206 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972"} err="failed to get container status \"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972\": rpc error: code = NotFound desc = could not find container \"dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972\": container with ID starting with dacfc282d3d9135418b829d24b97f079dd5415eca33df5c283dd29675a497972 not found: ID does not exist"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.768292 4816 scope.go:117] "RemoveContainer" containerID="da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb"
Mar 11 13:03:48 crc kubenswrapper[4816]: E0311 13:03:48.768998 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb\": container with ID starting with da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb not found: ID does not exist" containerID="da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.769063 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb"} err="failed to get container status \"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb\": rpc error: code = NotFound desc = could not find container \"da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb\": container with ID starting with da8bb1d584a6b0c6a8a96827e7082c99fd714a201ca1d6a4a5d5b03b0ad00acb not found: ID does not exist"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.769114 4816 scope.go:117] "RemoveContainer" containerID="8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43"
Mar 11 13:03:48 crc kubenswrapper[4816]: E0311 13:03:48.769648 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43\": container with ID starting with 8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43 not found: ID does not exist" containerID="8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43"
Mar 11 13:03:48 crc kubenswrapper[4816]: I0311 13:03:48.769682 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43"} err="failed to get container status \"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43\": rpc error: code = NotFound desc = could not find container \"8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43\": container with ID starting with 8a34047447ea78c8a4031c4f7ae65781dc927463fd6ea54c340f3538cea03d43 not found: ID does not exist"
Mar 11 13:03:50 crc kubenswrapper[4816]: I0311 13:03:50.149105 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" path="/var/lib/kubelet/pods/81dece66-cc39-4d85-b338-fe3626c87bff/volumes"
Mar 11 13:03:52 crc kubenswrapper[4816]: I0311 13:03:52.131705 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:03:52 crc kubenswrapper[4816]: E0311 13:03:52.132591 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.184766 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"]
Mar 11 13:04:00 crc kubenswrapper[4816]: E0311 13:04:00.186411 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="extract-utilities"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.186440 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="extract-utilities"
Mar 11 13:04:00 crc kubenswrapper[4816]: E0311 13:04:00.186482 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="registry-server"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.186498 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="registry-server"
Mar 11 13:04:00 crc kubenswrapper[4816]: E0311 13:04:00.186541 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="extract-content"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.186556 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="extract-content"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.186826 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="81dece66-cc39-4d85-b338-fe3626c87bff" containerName="registry-server"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.187758 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553904-2bx7k"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.191399 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.191493 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.192954 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.199678 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"]
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.259296 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") pod \"auto-csr-approver-29553904-2bx7k\" (UID: \"f176ec9f-47de-4710-a5f3-078403bb4bfb\") " pod="openshift-infra/auto-csr-approver-29553904-2bx7k"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.364417 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") pod \"auto-csr-approver-29553904-2bx7k\" (UID: \"f176ec9f-47de-4710-a5f3-078403bb4bfb\") " pod="openshift-infra/auto-csr-approver-29553904-2bx7k"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.400353 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") pod \"auto-csr-approver-29553904-2bx7k\" (UID: \"f176ec9f-47de-4710-a5f3-078403bb4bfb\") " pod="openshift-infra/auto-csr-approver-29553904-2bx7k"
Mar 11 13:04:00 crc kubenswrapper[4816]: I0311 13:04:00.522916 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553904-2bx7k"
Mar 11 13:04:01 crc kubenswrapper[4816]: I0311 13:04:01.136537 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"]
Mar 11 13:04:01 crc kubenswrapper[4816]: I0311 13:04:01.799741 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" event={"ID":"f176ec9f-47de-4710-a5f3-078403bb4bfb","Type":"ContainerStarted","Data":"bf633af1b6ecc571dcc5fcee156c69d430901deaca33b2db83e0d83c262bd147"}
Mar 11 13:04:02 crc kubenswrapper[4816]: I0311 13:04:02.811113 4816 generic.go:334] "Generic (PLEG): container finished" podID="f176ec9f-47de-4710-a5f3-078403bb4bfb" containerID="c0c120d96d0731c58ebb4a66094eed03724800f299fa6a22258f239a945115e0" exitCode=0
Mar 11 13:04:02 crc kubenswrapper[4816]: I0311 13:04:02.811329 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" event={"ID":"f176ec9f-47de-4710-a5f3-078403bb4bfb","Type":"ContainerDied","Data":"c0c120d96d0731c58ebb4a66094eed03724800f299fa6a22258f239a945115e0"}
Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.287958 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553904-2bx7k"
Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.440089 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") pod \"f176ec9f-47de-4710-a5f3-078403bb4bfb\" (UID: \"f176ec9f-47de-4710-a5f3-078403bb4bfb\") "
Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.450369 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d" (OuterVolumeSpecName: "kube-api-access-tkl7d") pod "f176ec9f-47de-4710-a5f3-078403bb4bfb" (UID: "f176ec9f-47de-4710-a5f3-078403bb4bfb"). InnerVolumeSpecName "kube-api-access-tkl7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.542458 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkl7d\" (UniqueName: \"kubernetes.io/projected/f176ec9f-47de-4710-a5f3-078403bb4bfb-kube-api-access-tkl7d\") on node \"crc\" DevicePath \"\""
Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.839429 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" event={"ID":"f176ec9f-47de-4710-a5f3-078403bb4bfb","Type":"ContainerDied","Data":"bf633af1b6ecc571dcc5fcee156c69d430901deaca33b2db83e0d83c262bd147"}
Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.839506 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf633af1b6ecc571dcc5fcee156c69d430901deaca33b2db83e0d83c262bd147"
Mar 11 13:04:04 crc kubenswrapper[4816]: I0311 13:04:04.839607 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553904-2bx7k" Mar 11 13:04:05 crc kubenswrapper[4816]: I0311 13:04:05.385746 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 13:04:05 crc kubenswrapper[4816]: I0311 13:04:05.397330 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553898-tcpck"] Mar 11 13:04:06 crc kubenswrapper[4816]: I0311 13:04:06.131188 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:04:06 crc kubenswrapper[4816]: E0311 13:04:06.131788 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:04:06 crc kubenswrapper[4816]: I0311 13:04:06.154954 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f75061-6a64-4c1d-b9f9-77f6425ad4c5" path="/var/lib/kubelet/pods/30f75061-6a64-4c1d-b9f9-77f6425ad4c5/volumes" Mar 11 13:04:18 crc kubenswrapper[4816]: I0311 13:04:18.130631 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:04:18 crc kubenswrapper[4816]: E0311 13:04:18.131642 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:04:33 crc kubenswrapper[4816]: I0311 13:04:33.132345 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:04:33 crc kubenswrapper[4816]: E0311 13:04:33.133631 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:04:47 crc kubenswrapper[4816]: I0311 13:04:47.130406 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5" Mar 11 13:04:48 crc kubenswrapper[4816]: I0311 13:04:48.294015 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f"} Mar 11 13:04:49 crc kubenswrapper[4816]: I0311 13:04:49.419846 4816 scope.go:117] "RemoveContainer" containerID="0c0de876588cbf0205a01555bac817ffc9ad65f6cabe6192282136fce8802326" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.508729 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:04:51 crc kubenswrapper[4816]: E0311 13:04:51.512181 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f176ec9f-47de-4710-a5f3-078403bb4bfb" containerName="oc" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.512387 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f176ec9f-47de-4710-a5f3-078403bb4bfb" containerName="oc" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 
13:04:51.512777 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f176ec9f-47de-4710-a5f3-078403bb4bfb" containerName="oc" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.514808 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.528986 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.553232 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.553358 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.553393 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.657351 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") pod 
\"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.657447 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.657685 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.658018 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.658639 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") pod \"redhat-operators-jtv9r\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.680508 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") pod \"redhat-operators-jtv9r\" (UID: 
\"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:51 crc kubenswrapper[4816]: I0311 13:04:51.853378 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:04:52 crc kubenswrapper[4816]: I0311 13:04:52.116601 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:04:53 crc kubenswrapper[4816]: I0311 13:04:53.344691 4816 generic.go:334] "Generic (PLEG): container finished" podID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerID="c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36" exitCode=0 Mar 11 13:04:53 crc kubenswrapper[4816]: I0311 13:04:53.344770 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerDied","Data":"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36"} Mar 11 13:04:53 crc kubenswrapper[4816]: I0311 13:04:53.345175 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerStarted","Data":"7dadfe53701c500277eb8a550c826e130375f4f58cab5a05c724a1695adb23b2"} Mar 11 13:04:55 crc kubenswrapper[4816]: I0311 13:04:55.370968 4816 generic.go:334] "Generic (PLEG): container finished" podID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerID="d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569" exitCode=0 Mar 11 13:04:55 crc kubenswrapper[4816]: I0311 13:04:55.371106 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerDied","Data":"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569"} Mar 11 13:04:57 crc kubenswrapper[4816]: I0311 13:04:57.402058 4816 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerStarted","Data":"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7"} Mar 11 13:04:57 crc kubenswrapper[4816]: I0311 13:04:57.436384 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtv9r" podStartSLOduration=3.170639878 podStartE2EDuration="6.436358033s" podCreationTimestamp="2026-03-11 13:04:51 +0000 UTC" firstStartedPulling="2026-03-11 13:04:53.348159539 +0000 UTC m=+3979.939423506" lastFinishedPulling="2026-03-11 13:04:56.613877664 +0000 UTC m=+3983.205141661" observedRunningTime="2026-03-11 13:04:57.431103372 +0000 UTC m=+3984.022367379" watchObservedRunningTime="2026-03-11 13:04:57.436358033 +0000 UTC m=+3984.027622040" Mar 11 13:05:01 crc kubenswrapper[4816]: I0311 13:05:01.854097 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:01 crc kubenswrapper[4816]: I0311 13:05:01.854919 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:02 crc kubenswrapper[4816]: I0311 13:05:02.919965 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtv9r" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" probeResult="failure" output=< Mar 11 13:05:02 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 13:05:02 crc kubenswrapper[4816]: > Mar 11 13:05:11 crc kubenswrapper[4816]: I0311 13:05:11.942985 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:12 crc kubenswrapper[4816]: I0311 13:05:12.019086 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:12 crc kubenswrapper[4816]: I0311 13:05:12.202407 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:05:13 crc kubenswrapper[4816]: I0311 13:05:13.565150 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtv9r" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" containerID="cri-o://6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" gracePeriod=2 Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.102930 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.188377 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") pod \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.188571 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") pod \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.188744 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") pod \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\" (UID: \"52bf5aed-b67d-4fe6-8564-5756e640aa5d\") " Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.190264 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities" (OuterVolumeSpecName: "utilities") pod "52bf5aed-b67d-4fe6-8564-5756e640aa5d" (UID: "52bf5aed-b67d-4fe6-8564-5756e640aa5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.200464 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc" (OuterVolumeSpecName: "kube-api-access-xjzsc") pod "52bf5aed-b67d-4fe6-8564-5756e640aa5d" (UID: "52bf5aed-b67d-4fe6-8564-5756e640aa5d"). InnerVolumeSpecName "kube-api-access-xjzsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.292095 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.292189 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjzsc\" (UniqueName: \"kubernetes.io/projected/52bf5aed-b67d-4fe6-8564-5756e640aa5d-kube-api-access-xjzsc\") on node \"crc\" DevicePath \"\"" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.350445 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52bf5aed-b67d-4fe6-8564-5756e640aa5d" (UID: "52bf5aed-b67d-4fe6-8564-5756e640aa5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.394220 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52bf5aed-b67d-4fe6-8564-5756e640aa5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.577268 4816 generic.go:334] "Generic (PLEG): container finished" podID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerID="6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" exitCode=0 Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.577284 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerDied","Data":"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7"} Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.578330 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtv9r" event={"ID":"52bf5aed-b67d-4fe6-8564-5756e640aa5d","Type":"ContainerDied","Data":"7dadfe53701c500277eb8a550c826e130375f4f58cab5a05c724a1695adb23b2"} Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.577411 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtv9r" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.578398 4816 scope.go:117] "RemoveContainer" containerID="6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.608081 4816 scope.go:117] "RemoveContainer" containerID="d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.620394 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.636023 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtv9r"] Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.645570 4816 scope.go:117] "RemoveContainer" containerID="c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.684012 4816 scope.go:117] "RemoveContainer" containerID="6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" Mar 11 13:05:14 crc kubenswrapper[4816]: E0311 13:05:14.684722 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7\": container with ID starting with 6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7 not found: ID does not exist" containerID="6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.684805 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7"} err="failed to get container status \"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7\": rpc error: code = NotFound desc = could not find container 
\"6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7\": container with ID starting with 6cdfd849eb2bd6ad5fc15276a0cd1417dcdbfb90ed4dda4160f33d2166307fd7 not found: ID does not exist" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.684848 4816 scope.go:117] "RemoveContainer" containerID="d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569" Mar 11 13:05:14 crc kubenswrapper[4816]: E0311 13:05:14.685800 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569\": container with ID starting with d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569 not found: ID does not exist" containerID="d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.685839 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569"} err="failed to get container status \"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569\": rpc error: code = NotFound desc = could not find container \"d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569\": container with ID starting with d73d66abd522eb4e8ef558f84ce5c9a47ba7b97c75d5b7458411c4ff70e78569 not found: ID does not exist" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.685864 4816 scope.go:117] "RemoveContainer" containerID="c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36" Mar 11 13:05:14 crc kubenswrapper[4816]: E0311 13:05:14.686166 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36\": container with ID starting with c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36 not found: ID does not exist" 
containerID="c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36" Mar 11 13:05:14 crc kubenswrapper[4816]: I0311 13:05:14.686195 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36"} err="failed to get container status \"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36\": rpc error: code = NotFound desc = could not find container \"c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36\": container with ID starting with c3b94d11cef2e239aec27c0d91fa2b19432a2a1cb9bf4dab3a136ed8f9e9ea36 not found: ID does not exist" Mar 11 13:05:16 crc kubenswrapper[4816]: I0311 13:05:16.154105 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" path="/var/lib/kubelet/pods/52bf5aed-b67d-4fe6-8564-5756e640aa5d/volumes" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.164135 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:06:00 crc kubenswrapper[4816]: E0311 13:06:00.165091 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="extract-content" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.165107 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="extract-content" Mar 11 13:06:00 crc kubenswrapper[4816]: E0311 13:06:00.165137 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.165144 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" Mar 11 13:06:00 crc kubenswrapper[4816]: E0311 13:06:00.165159 4816 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="extract-utilities" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.165166 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="extract-utilities" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.165332 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bf5aed-b67d-4fe6-8564-5756e640aa5d" containerName="registry-server" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.166020 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.169184 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.169597 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.169773 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.170966 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.253152 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") pod \"auto-csr-approver-29553906-4wxn9\" (UID: \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\") " pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.354454 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") pod \"auto-csr-approver-29553906-4wxn9\" (UID: \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\") " pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.385468 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") pod \"auto-csr-approver-29553906-4wxn9\" (UID: \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\") " pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.515046 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:00 crc kubenswrapper[4816]: I0311 13:06:00.813631 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:06:01 crc kubenswrapper[4816]: I0311 13:06:01.647146 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" event={"ID":"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf","Type":"ContainerStarted","Data":"3e7f4363ecc635bb29a64b19cb4b42e426db73749bd9c0629fc81823afe7412f"} Mar 11 13:06:02 crc kubenswrapper[4816]: I0311 13:06:02.656496 4816 generic.go:334] "Generic (PLEG): container finished" podID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" containerID="3049a692892071c6574e8ee18347abb47ed4c1ed532d21f9dde8bcb07555460f" exitCode=0 Mar 11 13:06:02 crc kubenswrapper[4816]: I0311 13:06:02.656572 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" event={"ID":"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf","Type":"ContainerDied","Data":"3049a692892071c6574e8ee18347abb47ed4c1ed532d21f9dde8bcb07555460f"} Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 
13:06:04.009110 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.116444 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") pod \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\" (UID: \"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf\") " Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.126719 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2" (OuterVolumeSpecName: "kube-api-access-64vx2") pod "4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" (UID: "4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf"). InnerVolumeSpecName "kube-api-access-64vx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.219686 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64vx2\" (UniqueName: \"kubernetes.io/projected/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf-kube-api-access-64vx2\") on node \"crc\" DevicePath \"\"" Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.676687 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" event={"ID":"4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf","Type":"ContainerDied","Data":"3e7f4363ecc635bb29a64b19cb4b42e426db73749bd9c0629fc81823afe7412f"} Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.676744 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e7f4363ecc635bb29a64b19cb4b42e426db73749bd9c0629fc81823afe7412f" Mar 11 13:06:04 crc kubenswrapper[4816]: I0311 13:06:04.676816 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553906-4wxn9" Mar 11 13:06:05 crc kubenswrapper[4816]: I0311 13:06:05.120552 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:06:05 crc kubenswrapper[4816]: I0311 13:06:05.133093 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553900-h76nw"] Mar 11 13:06:06 crc kubenswrapper[4816]: I0311 13:06:06.141844 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d463a78-830d-4b86-830a-e70345993927" path="/var/lib/kubelet/pods/9d463a78-830d-4b86-830a-e70345993927/volumes" Mar 11 13:06:49 crc kubenswrapper[4816]: I0311 13:06:49.547770 4816 scope.go:117] "RemoveContainer" containerID="02f79ceb28719ec9aa00f051068012e5f7850ccf8b02f5d8f4ecbb73c01a94f5" Mar 11 13:07:09 crc kubenswrapper[4816]: I0311 13:07:09.514895 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:07:09 crc kubenswrapper[4816]: I0311 13:07:09.515807 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:07:39 crc kubenswrapper[4816]: I0311 13:07:39.515339 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:07:39 crc kubenswrapper[4816]: 
I0311 13:07:39.516279 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.714914 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:42 crc kubenswrapper[4816]: E0311 13:07:42.716232 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" containerName="oc" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.716269 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" containerName="oc" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.716650 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" containerName="oc" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.719182 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.728065 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.815925 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.816001 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.817648 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.919495 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.919591 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.919642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.920474 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.920477 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:42 crc kubenswrapper[4816]: I0311 13:07:42.942555 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") pod \"community-operators-sxvrq\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:43 crc kubenswrapper[4816]: I0311 13:07:43.061744 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:43 crc kubenswrapper[4816]: I0311 13:07:43.465978 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:43 crc kubenswrapper[4816]: I0311 13:07:43.647371 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerStarted","Data":"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3"} Mar 11 13:07:43 crc kubenswrapper[4816]: I0311 13:07:43.647968 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerStarted","Data":"77d681d03c90044ec1e22b8547e0a03ed16a8ae23c9744a79518e8bb6fc6b1ff"} Mar 11 13:07:44 crc kubenswrapper[4816]: I0311 13:07:44.669885 4816 generic.go:334] "Generic (PLEG): container finished" podID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerID="cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3" exitCode=0 Mar 11 13:07:44 crc kubenswrapper[4816]: I0311 13:07:44.669956 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerDied","Data":"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3"} Mar 11 13:07:46 crc kubenswrapper[4816]: I0311 13:07:46.698469 4816 generic.go:334] "Generic (PLEG): container finished" podID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerID="a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869" exitCode=0 Mar 11 13:07:46 crc kubenswrapper[4816]: I0311 13:07:46.698549 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" 
event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerDied","Data":"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869"} Mar 11 13:07:47 crc kubenswrapper[4816]: I0311 13:07:47.711276 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerStarted","Data":"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6"} Mar 11 13:07:47 crc kubenswrapper[4816]: I0311 13:07:47.749115 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sxvrq" podStartSLOduration=3.279799417 podStartE2EDuration="5.749080258s" podCreationTimestamp="2026-03-11 13:07:42 +0000 UTC" firstStartedPulling="2026-03-11 13:07:44.675092027 +0000 UTC m=+4151.266356034" lastFinishedPulling="2026-03-11 13:07:47.144372868 +0000 UTC m=+4153.735636875" observedRunningTime="2026-03-11 13:07:47.738561598 +0000 UTC m=+4154.329825595" watchObservedRunningTime="2026-03-11 13:07:47.749080258 +0000 UTC m=+4154.340344235" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.061930 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.063000 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.114436 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.824087 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:53 crc kubenswrapper[4816]: I0311 13:07:53.895367 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:55 crc kubenswrapper[4816]: I0311 13:07:55.786694 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sxvrq" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="registry-server" containerID="cri-o://b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" gracePeriod=2 Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.282347 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.450519 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") pod \"eff46de2-0d75-4bb7-a269-d703a8621c8e\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.450587 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") pod \"eff46de2-0d75-4bb7-a269-d703a8621c8e\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.450715 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") pod \"eff46de2-0d75-4bb7-a269-d703a8621c8e\" (UID: \"eff46de2-0d75-4bb7-a269-d703a8621c8e\") " Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.451732 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities" (OuterVolumeSpecName: "utilities") pod "eff46de2-0d75-4bb7-a269-d703a8621c8e" (UID: 
"eff46de2-0d75-4bb7-a269-d703a8621c8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.459122 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq" (OuterVolumeSpecName: "kube-api-access-vxqlq") pod "eff46de2-0d75-4bb7-a269-d703a8621c8e" (UID: "eff46de2-0d75-4bb7-a269-d703a8621c8e"). InnerVolumeSpecName "kube-api-access-vxqlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.553339 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxqlq\" (UniqueName: \"kubernetes.io/projected/eff46de2-0d75-4bb7-a269-d703a8621c8e-kube-api-access-vxqlq\") on node \"crc\" DevicePath \"\"" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.553838 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.746535 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eff46de2-0d75-4bb7-a269-d703a8621c8e" (UID: "eff46de2-0d75-4bb7-a269-d703a8621c8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.757478 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff46de2-0d75-4bb7-a269-d703a8621c8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.803467 4816 generic.go:334] "Generic (PLEG): container finished" podID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerID="b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" exitCode=0 Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.803619 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxvrq" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.803618 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerDied","Data":"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6"} Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.805484 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxvrq" event={"ID":"eff46de2-0d75-4bb7-a269-d703a8621c8e","Type":"ContainerDied","Data":"77d681d03c90044ec1e22b8547e0a03ed16a8ae23c9744a79518e8bb6fc6b1ff"} Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.805540 4816 scope.go:117] "RemoveContainer" containerID="b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.845400 4816 scope.go:117] "RemoveContainer" containerID="a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.881513 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:56 crc kubenswrapper[4816]: 
I0311 13:07:56.891323 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sxvrq"] Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.894293 4816 scope.go:117] "RemoveContainer" containerID="cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.924522 4816 scope.go:117] "RemoveContainer" containerID="b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" Mar 11 13:07:56 crc kubenswrapper[4816]: E0311 13:07:56.925803 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6\": container with ID starting with b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6 not found: ID does not exist" containerID="b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.925849 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6"} err="failed to get container status \"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6\": rpc error: code = NotFound desc = could not find container \"b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6\": container with ID starting with b0c2bd7ab4e927fad91dd806abc61894a6708832ad4f1fbbfdfaf08bcb49b3a6 not found: ID does not exist" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.925884 4816 scope.go:117] "RemoveContainer" containerID="a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869" Mar 11 13:07:56 crc kubenswrapper[4816]: E0311 13:07:56.926658 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869\": container 
with ID starting with a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869 not found: ID does not exist" containerID="a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.926736 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869"} err="failed to get container status \"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869\": rpc error: code = NotFound desc = could not find container \"a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869\": container with ID starting with a5f903c936c84171813a5fa2e6d049fb8022544e414bfe1fd09a9d584785d869 not found: ID does not exist" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.926790 4816 scope.go:117] "RemoveContainer" containerID="cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3" Mar 11 13:07:56 crc kubenswrapper[4816]: E0311 13:07:56.927471 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3\": container with ID starting with cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3 not found: ID does not exist" containerID="cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3" Mar 11 13:07:56 crc kubenswrapper[4816]: I0311 13:07:56.927512 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3"} err="failed to get container status \"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3\": rpc error: code = NotFound desc = could not find container \"cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3\": container with ID starting with cad263e674abfaddb7974f4e9fbd5b072dec296365d06831a0d43b45bf2cbfe3 not 
found: ID does not exist" Mar 11 13:07:58 crc kubenswrapper[4816]: I0311 13:07:58.148934 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" path="/var/lib/kubelet/pods/eff46de2-0d75-4bb7-a269-d703a8621c8e/volumes" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.173930 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:08:00 crc kubenswrapper[4816]: E0311 13:08:00.174985 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="extract-utilities" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.175011 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="extract-utilities" Mar 11 13:08:00 crc kubenswrapper[4816]: E0311 13:08:00.175039 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="registry-server" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.175050 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="registry-server" Mar 11 13:08:00 crc kubenswrapper[4816]: E0311 13:08:00.175071 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="extract-content" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.175083 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="extract-content" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.175369 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff46de2-0d75-4bb7-a269-d703a8621c8e" containerName="registry-server" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.176174 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.181996 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.182310 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.182476 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.182124 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.326014 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") pod \"auto-csr-approver-29553908-7sdtd\" (UID: \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\") " pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.428172 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") pod \"auto-csr-approver-29553908-7sdtd\" (UID: \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\") " pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.459970 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") pod \"auto-csr-approver-29553908-7sdtd\" (UID: \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\") " 
pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.521732 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.801495 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:08:00 crc kubenswrapper[4816]: W0311 13:08:00.812609 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dc717b0_ae5a_46c4_9fea_48dcab17a9c8.slice/crio-4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d WatchSource:0}: Error finding container 4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d: Status 404 returned error can't find the container with id 4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d Mar 11 13:08:00 crc kubenswrapper[4816]: I0311 13:08:00.855459 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" event={"ID":"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8","Type":"ContainerStarted","Data":"4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d"} Mar 11 13:08:02 crc kubenswrapper[4816]: I0311 13:08:02.878633 4816 generic.go:334] "Generic (PLEG): container finished" podID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" containerID="8f956a9fd47ed00082f55e7f0e6d344e63382a72a835af03fd720051a5e8b801" exitCode=0 Mar 11 13:08:02 crc kubenswrapper[4816]: I0311 13:08:02.878729 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" event={"ID":"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8","Type":"ContainerDied","Data":"8f956a9fd47ed00082f55e7f0e6d344e63382a72a835af03fd720051a5e8b801"} Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.352497 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.505805 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") pod \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\" (UID: \"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8\") " Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.514332 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv" (OuterVolumeSpecName: "kube-api-access-sbmvv") pod "9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" (UID: "9dc717b0-ae5a-46c4-9fea-48dcab17a9c8"). InnerVolumeSpecName "kube-api-access-sbmvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.608699 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbmvv\" (UniqueName: \"kubernetes.io/projected/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8-kube-api-access-sbmvv\") on node \"crc\" DevicePath \"\"" Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.904340 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553908-7sdtd" event={"ID":"9dc717b0-ae5a-46c4-9fea-48dcab17a9c8","Type":"ContainerDied","Data":"4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d"} Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.904406 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b2a804e3091f1bc1028213099574573fabfe400673c93ed97d0c58eed843d2d" Mar 11 13:08:04 crc kubenswrapper[4816]: I0311 13:08:04.904461 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553908-7sdtd"
Mar 11 13:08:05 crc kubenswrapper[4816]: I0311 13:08:05.464172 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"]
Mar 11 13:08:05 crc kubenswrapper[4816]: I0311 13:08:05.476182 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553902-lffcf"]
Mar 11 13:08:06 crc kubenswrapper[4816]: I0311 13:08:06.143120 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a325766-41a7-415f-88ad-698627f015c1" path="/var/lib/kubelet/pods/6a325766-41a7-415f-88ad-698627f015c1/volumes"
Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.515376 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.517314 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.517543 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82"
Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.518724 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 13:08:09 crc kubenswrapper[4816]: I0311 13:08:09.519163 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f" gracePeriod=600
Mar 11 13:08:10 crc kubenswrapper[4816]: I0311 13:08:10.002341 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f" exitCode=0
Mar 11 13:08:10 crc kubenswrapper[4816]: I0311 13:08:10.002465 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f"}
Mar 11 13:08:10 crc kubenswrapper[4816]: I0311 13:08:10.002978 4816 scope.go:117] "RemoveContainer" containerID="64abbd1ae7fc66fc92a0c249522d883f3abbf1e20c434283b17e9756d41408a5"
Mar 11 13:08:11 crc kubenswrapper[4816]: I0311 13:08:11.022300 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e"}
Mar 11 13:08:49 crc kubenswrapper[4816]: I0311 13:08:49.655394 4816 scope.go:117] "RemoveContainer" containerID="6b272c0cd2cf4fb57145d2f34bc9f76d7316747da7af06ee61d93b20ed09cce5"
Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.971867 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"]
Mar 11 13:09:06 crc kubenswrapper[4816]: E0311 13:09:06.972959 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" containerName="oc"
Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.972977 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" containerName="oc"
Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.973172 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" containerName="oc"
Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.974604 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:06 crc kubenswrapper[4816]: I0311 13:09:06.989850 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"]
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.175749 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.175846 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.175880 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.277683 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.277809 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.277863 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.278679 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.278738 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.302704 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") pod \"redhat-marketplace-4vkqz\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") " pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.313776 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:07 crc kubenswrapper[4816]: I0311 13:09:07.775992 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"]
Mar 11 13:09:08 crc kubenswrapper[4816]: I0311 13:09:08.666500 4816 generic.go:334] "Generic (PLEG): container finished" podID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerID="e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7" exitCode=0
Mar 11 13:09:08 crc kubenswrapper[4816]: I0311 13:09:08.666606 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerDied","Data":"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7"}
Mar 11 13:09:08 crc kubenswrapper[4816]: I0311 13:09:08.666986 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerStarted","Data":"dfd5f229efde5cc7964f1d4a40897e75908c215ed7813b1fd48bb7c0b8cb6acb"}
Mar 11 13:09:08 crc kubenswrapper[4816]: I0311 13:09:08.670968 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 13:09:09 crc kubenswrapper[4816]: I0311 13:09:09.679517 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerStarted","Data":"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"}
Mar 11 13:09:10 crc kubenswrapper[4816]: I0311 13:09:10.710564 4816 generic.go:334] "Generic (PLEG): container finished" podID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerID="3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a" exitCode=0
Mar 11 13:09:10 crc kubenswrapper[4816]: I0311 13:09:10.710661 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerDied","Data":"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"}
Mar 11 13:09:12 crc kubenswrapper[4816]: I0311 13:09:12.734415 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerStarted","Data":"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"}
Mar 11 13:09:12 crc kubenswrapper[4816]: I0311 13:09:12.766427 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4vkqz" podStartSLOduration=4.312527904 podStartE2EDuration="6.766405485s" podCreationTimestamp="2026-03-11 13:09:06 +0000 UTC" firstStartedPulling="2026-03-11 13:09:08.670458024 +0000 UTC m=+4235.261722021" lastFinishedPulling="2026-03-11 13:09:11.124335595 +0000 UTC m=+4237.715599602" observedRunningTime="2026-03-11 13:09:12.757948173 +0000 UTC m=+4239.349212150" watchObservedRunningTime="2026-03-11 13:09:12.766405485 +0000 UTC m=+4239.357669462"
Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.314727 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.315601 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.388387 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.865217 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:17 crc kubenswrapper[4816]: I0311 13:09:17.935563 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"]
Mar 11 13:09:19 crc kubenswrapper[4816]: I0311 13:09:19.805305 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4vkqz" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="registry-server" containerID="cri-o://58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2" gracePeriod=2
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.328016 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.442789 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") pod \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") "
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.442936 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") pod \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") "
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.443128 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") pod \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\" (UID: \"bc2b0115-a0c2-49f3-b371-41dab6a785d9\") "
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.443994 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities" (OuterVolumeSpecName: "utilities") pod "bc2b0115-a0c2-49f3-b371-41dab6a785d9" (UID: "bc2b0115-a0c2-49f3-b371-41dab6a785d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.453819 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt" (OuterVolumeSpecName: "kube-api-access-p4jnt") pod "bc2b0115-a0c2-49f3-b371-41dab6a785d9" (UID: "bc2b0115-a0c2-49f3-b371-41dab6a785d9"). InnerVolumeSpecName "kube-api-access-p4jnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.496577 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc2b0115-a0c2-49f3-b371-41dab6a785d9" (UID: "bc2b0115-a0c2-49f3-b371-41dab6a785d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.546312 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4jnt\" (UniqueName: \"kubernetes.io/projected/bc2b0115-a0c2-49f3-b371-41dab6a785d9-kube-api-access-p4jnt\") on node \"crc\" DevicePath \"\""
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.546644 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.546859 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2b0115-a0c2-49f3-b371-41dab6a785d9-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.819436 4816 generic.go:334] "Generic (PLEG): container finished" podID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerID="58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2" exitCode=0
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.819522 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerDied","Data":"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"}
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.820025 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vkqz" event={"ID":"bc2b0115-a0c2-49f3-b371-41dab6a785d9","Type":"ContainerDied","Data":"dfd5f229efde5cc7964f1d4a40897e75908c215ed7813b1fd48bb7c0b8cb6acb"}
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.820062 4816 scope.go:117] "RemoveContainer" containerID="58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.819642 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vkqz"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.879384 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"]
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.881462 4816 scope.go:117] "RemoveContainer" containerID="3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.887842 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vkqz"]
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.910616 4816 scope.go:117] "RemoveContainer" containerID="e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.954988 4816 scope.go:117] "RemoveContainer" containerID="58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"
Mar 11 13:09:20 crc kubenswrapper[4816]: E0311 13:09:20.955802 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2\": container with ID starting with 58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2 not found: ID does not exist" containerID="58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.955866 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2"} err="failed to get container status \"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2\": rpc error: code = NotFound desc = could not find container \"58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2\": container with ID starting with 58def574d8980054044e984e5f7d87fe3d19586ad083bc2d90dca2bfdc3dd0a2 not found: ID does not exist"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.955909 4816 scope.go:117] "RemoveContainer" containerID="3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"
Mar 11 13:09:20 crc kubenswrapper[4816]: E0311 13:09:20.956631 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a\": container with ID starting with 3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a not found: ID does not exist" containerID="3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.956741 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a"} err="failed to get container status \"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a\": rpc error: code = NotFound desc = could not find container \"3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a\": container with ID starting with 3df1bf3eb00f84df126d7b442d84242040019a6e3a9832bfefebf0d12258ad6a not found: ID does not exist"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.956771 4816 scope.go:117] "RemoveContainer" containerID="e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7"
Mar 11 13:09:20 crc kubenswrapper[4816]: E0311 13:09:20.957406 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7\": container with ID starting with e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7 not found: ID does not exist" containerID="e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7"
Mar 11 13:09:20 crc kubenswrapper[4816]: I0311 13:09:20.957619 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7"} err="failed to get container status \"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7\": rpc error: code = NotFound desc = could not find container \"e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7\": container with ID starting with e2897f156738485f6534dccbb57309b212ce8b23650fd9d0fa77ee31641e76d7 not found: ID does not exist"
Mar 11 13:09:22 crc kubenswrapper[4816]: I0311 13:09:22.150578 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" path="/var/lib/kubelet/pods/bc2b0115-a0c2-49f3-b371-41dab6a785d9/volumes"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.189463 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"]
Mar 11 13:10:00 crc kubenswrapper[4816]: E0311 13:10:00.191132 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="registry-server"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.191157 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="registry-server"
Mar 11 13:10:00 crc kubenswrapper[4816]: E0311 13:10:00.191180 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="extract-utilities"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.191195 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="extract-utilities"
Mar 11 13:10:00 crc kubenswrapper[4816]: E0311 13:10:00.191217 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="extract-content"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.191227 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="extract-content"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.191668 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2b0115-a0c2-49f3-b371-41dab6a785d9" containerName="registry-server"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.192689 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553910-xrjfk"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.197156 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.202984 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.203029 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.204487 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"]
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.302879 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") pod \"auto-csr-approver-29553910-xrjfk\" (UID: \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\") " pod="openshift-infra/auto-csr-approver-29553910-xrjfk"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.405050 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") pod \"auto-csr-approver-29553910-xrjfk\" (UID: \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\") " pod="openshift-infra/auto-csr-approver-29553910-xrjfk"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.443562 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") pod \"auto-csr-approver-29553910-xrjfk\" (UID: \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\") " pod="openshift-infra/auto-csr-approver-29553910-xrjfk"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.553062 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553910-xrjfk"
Mar 11 13:10:00 crc kubenswrapper[4816]: I0311 13:10:00.847808 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"]
Mar 11 13:10:01 crc kubenswrapper[4816]: I0311 13:10:01.240478 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" event={"ID":"4ee2e218-36ee-47c0-9bca-f2f6affd5b02","Type":"ContainerStarted","Data":"e719823490853e7798213dd08945ce63d536796333fa822e405ae33b26d6d66d"}
Mar 11 13:10:03 crc kubenswrapper[4816]: I0311 13:10:03.263527 4816 generic.go:334] "Generic (PLEG): container finished" podID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" containerID="13136e90ba59855de085b0d87fba900a964c210d6db5608d7bd773e44d7b1505" exitCode=0
Mar 11 13:10:03 crc kubenswrapper[4816]: I0311 13:10:03.263613 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" event={"ID":"4ee2e218-36ee-47c0-9bca-f2f6affd5b02","Type":"ContainerDied","Data":"13136e90ba59855de085b0d87fba900a964c210d6db5608d7bd773e44d7b1505"}
Mar 11 13:10:04 crc kubenswrapper[4816]: I0311 13:10:04.606054 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553910-xrjfk"
Mar 11 13:10:04 crc kubenswrapper[4816]: I0311 13:10:04.789674 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") pod \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\" (UID: \"4ee2e218-36ee-47c0-9bca-f2f6affd5b02\") "
Mar 11 13:10:04 crc kubenswrapper[4816]: I0311 13:10:04.799314 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr" (OuterVolumeSpecName: "kube-api-access-7kqwr") pod "4ee2e218-36ee-47c0-9bca-f2f6affd5b02" (UID: "4ee2e218-36ee-47c0-9bca-f2f6affd5b02"). InnerVolumeSpecName "kube-api-access-7kqwr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 13:10:04 crc kubenswrapper[4816]: I0311 13:10:04.892767 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kqwr\" (UniqueName: \"kubernetes.io/projected/4ee2e218-36ee-47c0-9bca-f2f6affd5b02-kube-api-access-7kqwr\") on node \"crc\" DevicePath \"\""
Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.286927 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553910-xrjfk" event={"ID":"4ee2e218-36ee-47c0-9bca-f2f6affd5b02","Type":"ContainerDied","Data":"e719823490853e7798213dd08945ce63d536796333fa822e405ae33b26d6d66d"}
Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.287404 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e719823490853e7798213dd08945ce63d536796333fa822e405ae33b26d6d66d"
Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.287096 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553910-xrjfk"
Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.712920 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"]
Mar 11 13:10:05 crc kubenswrapper[4816]: I0311 13:10:05.723521 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553904-2bx7k"]
Mar 11 13:10:06 crc kubenswrapper[4816]: I0311 13:10:06.145103 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f176ec9f-47de-4710-a5f3-078403bb4bfb" path="/var/lib/kubelet/pods/f176ec9f-47de-4710-a5f3-078403bb4bfb/volumes"
Mar 11 13:10:39 crc kubenswrapper[4816]: I0311 13:10:39.515732 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 13:10:39 crc kubenswrapper[4816]: I0311 13:10:39.516689 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 13:10:49 crc kubenswrapper[4816]: I0311 13:10:49.816603 4816 scope.go:117] "RemoveContainer" containerID="c0c120d96d0731c58ebb4a66094eed03724800f299fa6a22258f239a945115e0"
Mar 11 13:11:09 crc kubenswrapper[4816]: I0311 13:11:09.515035 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 13:11:09 crc kubenswrapper[4816]: I0311 13:11:09.515989 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.515759 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.516471 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.516539 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82"
Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.517153 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 13:11:39 crc kubenswrapper[4816]: I0311 13:11:39.517226 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" gracePeriod=600
Mar 11 13:11:39 crc kubenswrapper[4816]: E0311 13:11:39.663332 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:11:40 crc kubenswrapper[4816]: I0311 13:11:40.205168 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" exitCode=0
Mar 11 13:11:40 crc kubenswrapper[4816]: I0311 13:11:40.205228 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e"}
Mar 11 13:11:40 crc kubenswrapper[4816]: I0311 13:11:40.205311 4816 scope.go:117] "RemoveContainer" containerID="93c58f402ba486c6c006c97994ab202bfd22495eff9729c60fbdfcbe918d3c5f"
Mar 11 13:11:40 crc kubenswrapper[4816]: I0311 13:11:40.206149 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e"
Mar 11 13:11:40 crc kubenswrapper[4816]: E0311 13:11:40.206728 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:11:54 crc kubenswrapper[4816]: I0311 13:11:54.139851 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e"
Mar 11 13:11:54 crc kubenswrapper[4816]: E0311 13:11:54.141285 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.178519 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"]
Mar 11 13:12:00 crc kubenswrapper[4816]: E0311 13:12:00.179910 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" containerName="oc"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.179936 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" containerName="oc"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.180213 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" containerName="oc"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.181091 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553912-dgw9l"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.185105 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") pod \"auto-csr-approver-29553912-dgw9l\" (UID: \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\") " pod="openshift-infra/auto-csr-approver-29553912-dgw9l"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.185543 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.189465 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.194652 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.197992 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"]
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.287625 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") pod \"auto-csr-approver-29553912-dgw9l\" (UID: \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\") " pod="openshift-infra/auto-csr-approver-29553912-dgw9l"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.319606 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") pod \"auto-csr-approver-29553912-dgw9l\" (UID: \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\") " pod="openshift-infra/auto-csr-approver-29553912-dgw9l"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.509134 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553912-dgw9l"
Mar 11 13:12:00 crc kubenswrapper[4816]: I0311 13:12:00.992213 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"]
Mar 11 13:12:01 crc kubenswrapper[4816]: I0311 13:12:01.413544 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" event={"ID":"0cc88fac-43d8-4178-9b36-fc5bd4b04818","Type":"ContainerStarted","Data":"8d920cabe9bb9dc8224b3ba1049bb6821a23c6fcb76f2925fc6fafc3d3fa8a92"}
Mar 11 13:12:03 crc kubenswrapper[4816]: I0311 13:12:03.434231 4816 generic.go:334] "Generic (PLEG): container finished" podID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" containerID="148ded4a02efdc34a61cfc1e6b248706834d114bbcd8c2d3fc0a1082e7f112b8" exitCode=0
Mar 11 13:12:03 crc kubenswrapper[4816]: I0311 13:12:03.434342 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" event={"ID":"0cc88fac-43d8-4178-9b36-fc5bd4b04818","Type":"ContainerDied","Data":"148ded4a02efdc34a61cfc1e6b248706834d114bbcd8c2d3fc0a1082e7f112b8"}
Mar 11 13:12:04 crc kubenswrapper[4816]: I0311 13:12:04.795346 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:04 crc kubenswrapper[4816]: I0311 13:12:04.975800 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") pod \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\" (UID: \"0cc88fac-43d8-4178-9b36-fc5bd4b04818\") " Mar 11 13:12:04 crc kubenswrapper[4816]: I0311 13:12:04.984365 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr" (OuterVolumeSpecName: "kube-api-access-st8rr") pod "0cc88fac-43d8-4178-9b36-fc5bd4b04818" (UID: "0cc88fac-43d8-4178-9b36-fc5bd4b04818"). InnerVolumeSpecName "kube-api-access-st8rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.077900 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st8rr\" (UniqueName: \"kubernetes.io/projected/0cc88fac-43d8-4178-9b36-fc5bd4b04818-kube-api-access-st8rr\") on node \"crc\" DevicePath \"\"" Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.458293 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" event={"ID":"0cc88fac-43d8-4178-9b36-fc5bd4b04818","Type":"ContainerDied","Data":"8d920cabe9bb9dc8224b3ba1049bb6821a23c6fcb76f2925fc6fafc3d3fa8a92"} Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.458353 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d920cabe9bb9dc8224b3ba1049bb6821a23c6fcb76f2925fc6fafc3d3fa8a92" Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.458400 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553912-dgw9l" Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.892223 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:12:05 crc kubenswrapper[4816]: I0311 13:12:05.898632 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553906-4wxn9"] Mar 11 13:12:06 crc kubenswrapper[4816]: I0311 13:12:06.141025 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf" path="/var/lib/kubelet/pods/4566b1c1-9d53-4e1f-8406-ff9c89aaf8cf/volumes" Mar 11 13:12:09 crc kubenswrapper[4816]: I0311 13:12:09.130496 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:12:09 crc kubenswrapper[4816]: E0311 13:12:09.132972 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:23 crc kubenswrapper[4816]: I0311 13:12:23.131167 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:12:23 crc kubenswrapper[4816]: E0311 13:12:23.132372 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:37 crc kubenswrapper[4816]: I0311 13:12:37.130821 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:12:37 crc kubenswrapper[4816]: E0311 13:12:37.131950 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:48 crc kubenswrapper[4816]: I0311 13:12:48.131929 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:12:48 crc kubenswrapper[4816]: E0311 13:12:48.133151 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:12:49 crc kubenswrapper[4816]: I0311 13:12:49.908217 4816 scope.go:117] "RemoveContainer" containerID="3049a692892071c6574e8ee18347abb47ed4c1ed532d21f9dde8bcb07555460f" Mar 11 13:13:01 crc kubenswrapper[4816]: I0311 13:13:01.131234 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:01 crc kubenswrapper[4816]: E0311 13:13:01.133034 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:13:13 crc kubenswrapper[4816]: I0311 13:13:13.130914 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:13 crc kubenswrapper[4816]: E0311 13:13:13.132176 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:13:26 crc kubenswrapper[4816]: I0311 13:13:26.131534 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:26 crc kubenswrapper[4816]: E0311 13:13:26.132691 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:13:39 crc kubenswrapper[4816]: I0311 13:13:39.130884 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:39 crc kubenswrapper[4816]: E0311 13:13:39.132169 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:13:54 crc kubenswrapper[4816]: I0311 13:13:54.139052 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:13:54 crc kubenswrapper[4816]: E0311 13:13:54.140530 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.176330 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:14:00 crc kubenswrapper[4816]: E0311 13:14:00.177562 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" containerName="oc" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.177579 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" containerName="oc" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.177773 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" containerName="oc" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.178444 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.186392 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.186895 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.187313 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.198156 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.284135 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") pod \"auto-csr-approver-29553914-vtpr2\" (UID: \"32b79556-cf6a-450f-9214-70d0854dc630\") " pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.386650 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") pod \"auto-csr-approver-29553914-vtpr2\" (UID: \"32b79556-cf6a-450f-9214-70d0854dc630\") " pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.433005 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") pod \"auto-csr-approver-29553914-vtpr2\" (UID: \"32b79556-cf6a-450f-9214-70d0854dc630\") " 
pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:00 crc kubenswrapper[4816]: I0311 13:14:00.521779 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:01 crc kubenswrapper[4816]: I0311 13:14:01.022127 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:14:01 crc kubenswrapper[4816]: I0311 13:14:01.622816 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" event={"ID":"32b79556-cf6a-450f-9214-70d0854dc630","Type":"ContainerStarted","Data":"686ff9ffa6818ed33837cab95d1d4a16f7050819c4a25812e70e0cd6dc4bb0fd"} Mar 11 13:14:03 crc kubenswrapper[4816]: I0311 13:14:03.643950 4816 generic.go:334] "Generic (PLEG): container finished" podID="32b79556-cf6a-450f-9214-70d0854dc630" containerID="8e7758cfa0d68340bf0bfe400a0bcdda434a161dca369cd6a56c8194d33e640d" exitCode=0 Mar 11 13:14:03 crc kubenswrapper[4816]: I0311 13:14:03.644758 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" event={"ID":"32b79556-cf6a-450f-9214-70d0854dc630","Type":"ContainerDied","Data":"8e7758cfa0d68340bf0bfe400a0bcdda434a161dca369cd6a56c8194d33e640d"} Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.114438 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.197837 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") pod \"32b79556-cf6a-450f-9214-70d0854dc630\" (UID: \"32b79556-cf6a-450f-9214-70d0854dc630\") " Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.205495 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt" (OuterVolumeSpecName: "kube-api-access-gtwkt") pod "32b79556-cf6a-450f-9214-70d0854dc630" (UID: "32b79556-cf6a-450f-9214-70d0854dc630"). InnerVolumeSpecName "kube-api-access-gtwkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.300345 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtwkt\" (UniqueName: \"kubernetes.io/projected/32b79556-cf6a-450f-9214-70d0854dc630-kube-api-access-gtwkt\") on node \"crc\" DevicePath \"\"" Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.673879 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" event={"ID":"32b79556-cf6a-450f-9214-70d0854dc630","Type":"ContainerDied","Data":"686ff9ffa6818ed33837cab95d1d4a16f7050819c4a25812e70e0cd6dc4bb0fd"} Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.673935 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="686ff9ffa6818ed33837cab95d1d4a16f7050819c4a25812e70e0cd6dc4bb0fd" Mar 11 13:14:05 crc kubenswrapper[4816]: I0311 13:14:05.673945 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553914-vtpr2" Mar 11 13:14:06 crc kubenswrapper[4816]: I0311 13:14:06.201775 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:14:06 crc kubenswrapper[4816]: I0311 13:14:06.208856 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553908-7sdtd"] Mar 11 13:14:07 crc kubenswrapper[4816]: I0311 13:14:07.130843 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:07 crc kubenswrapper[4816]: E0311 13:14:07.131193 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:08 crc kubenswrapper[4816]: I0311 13:14:08.145982 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc717b0-ae5a-46c4-9fea-48dcab17a9c8" path="/var/lib/kubelet/pods/9dc717b0-ae5a-46c4-9fea-48dcab17a9c8/volumes" Mar 11 13:14:19 crc kubenswrapper[4816]: I0311 13:14:19.131073 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:19 crc kubenswrapper[4816]: E0311 13:14:19.132360 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:30 crc kubenswrapper[4816]: I0311 13:14:30.131052 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:30 crc kubenswrapper[4816]: E0311 13:14:30.132346 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:42 crc kubenswrapper[4816]: I0311 13:14:42.130512 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:42 crc kubenswrapper[4816]: E0311 13:14:42.131646 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:14:50 crc kubenswrapper[4816]: I0311 13:14:50.028666 4816 scope.go:117] "RemoveContainer" containerID="8f956a9fd47ed00082f55e7f0e6d344e63382a72a835af03fd720051a5e8b801" Mar 11 13:14:55 crc kubenswrapper[4816]: I0311 13:14:55.133146 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:14:55 crc kubenswrapper[4816]: E0311 13:14:55.135106 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.188513 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d"] Mar 11 13:15:00 crc kubenswrapper[4816]: E0311 13:15:00.189491 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b79556-cf6a-450f-9214-70d0854dc630" containerName="oc" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.189516 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b79556-cf6a-450f-9214-70d0854dc630" containerName="oc" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.189722 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b79556-cf6a-450f-9214-70d0854dc630" containerName="oc" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.190464 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.192821 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.192931 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.198280 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d"] Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.368085 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.369055 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.384675 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.485999 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.486132 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.486182 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.488039 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.507712 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.509160 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") pod \"collect-profiles-29553915-wvm5d\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:00 crc kubenswrapper[4816]: I0311 13:15:00.559556 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:01 crc kubenswrapper[4816]: I0311 13:15:01.054596 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d"] Mar 11 13:15:01 crc kubenswrapper[4816]: I0311 13:15:01.406320 4816 generic.go:334] "Generic (PLEG): container finished" podID="5bfbb073-59ae-4e0f-9b46-4f27865d35dd" containerID="f6b3186ed4fea575de80650b31d2a10afb3b6a19453228e0dffc9785f41cc4e3" exitCode=0 Mar 11 13:15:01 crc kubenswrapper[4816]: I0311 13:15:01.406379 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" event={"ID":"5bfbb073-59ae-4e0f-9b46-4f27865d35dd","Type":"ContainerDied","Data":"f6b3186ed4fea575de80650b31d2a10afb3b6a19453228e0dffc9785f41cc4e3"} Mar 11 13:15:01 crc kubenswrapper[4816]: I0311 13:15:01.406444 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" 
event={"ID":"5bfbb073-59ae-4e0f-9b46-4f27865d35dd","Type":"ContainerStarted","Data":"6c8d88f488341419dac3054536326536c7f997ffe41c41a21191b1dcd393ffd5"} Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.447452 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.450618 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.470981 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.617635 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.617692 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.617862 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.718371 
4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.718420 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.718479 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.719479 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.719514 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.752086 4816 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") pod \"certified-operators-4xkfb\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.797509 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.800692 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.920988 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") pod \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.921073 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") pod \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.921137 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") pod \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\" (UID: \"5bfbb073-59ae-4e0f-9b46-4f27865d35dd\") " Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.923205 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "5bfbb073-59ae-4e0f-9b46-4f27865d35dd" (UID: "5bfbb073-59ae-4e0f-9b46-4f27865d35dd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.924198 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5bfbb073-59ae-4e0f-9b46-4f27865d35dd" (UID: "5bfbb073-59ae-4e0f-9b46-4f27865d35dd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 13:15:02 crc kubenswrapper[4816]: I0311 13:15:02.924588 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv" (OuterVolumeSpecName: "kube-api-access-vm2lv") pod "5bfbb073-59ae-4e0f-9b46-4f27865d35dd" (UID: "5bfbb073-59ae-4e0f-9b46-4f27865d35dd"). InnerVolumeSpecName "kube-api-access-vm2lv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.022724 4816 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.022767 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm2lv\" (UniqueName: \"kubernetes.io/projected/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-kube-api-access-vm2lv\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.022782 4816 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bfbb073-59ae-4e0f-9b46-4f27865d35dd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.039940 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:03 crc kubenswrapper[4816]: W0311 13:15:03.046216 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d9acb13_e6a9_4833_8cfe_3801fd85e2a5.slice/crio-870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54 WatchSource:0}: Error finding container 870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54: Status 404 returned error can't find the container with id 870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54 Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.424741 4816 generic.go:334] "Generic (PLEG): container finished" podID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerID="18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b" exitCode=0 Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.424976 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerDied","Data":"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b"} Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.425004 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerStarted","Data":"870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54"} Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.426530 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.427881 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" event={"ID":"5bfbb073-59ae-4e0f-9b46-4f27865d35dd","Type":"ContainerDied","Data":"6c8d88f488341419dac3054536326536c7f997ffe41c41a21191b1dcd393ffd5"} Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.427909 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c8d88f488341419dac3054536326536c7f997ffe41c41a21191b1dcd393ffd5" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.427945 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553915-wvm5d" Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.922507 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 13:15:03 crc kubenswrapper[4816]: I0311 13:15:03.928888 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553870-n6l8j"] Mar 11 13:15:04 crc kubenswrapper[4816]: I0311 13:15:04.164972 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af7ed2ea-fc1c-4a1a-bf16-50a9817aac81" path="/var/lib/kubelet/pods/af7ed2ea-fc1c-4a1a-bf16-50a9817aac81/volumes" Mar 11 13:15:05 crc kubenswrapper[4816]: I0311 13:15:05.447538 4816 generic.go:334] "Generic (PLEG): container finished" podID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerID="cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9" exitCode=0 Mar 11 13:15:05 crc kubenswrapper[4816]: I0311 13:15:05.447630 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerDied","Data":"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9"} Mar 11 13:15:06 crc kubenswrapper[4816]: I0311 13:15:06.458121 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerStarted","Data":"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf"} Mar 11 13:15:06 crc kubenswrapper[4816]: I0311 13:15:06.501745 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4xkfb" podStartSLOduration=1.9507583560000001 podStartE2EDuration="4.501713162s" podCreationTimestamp="2026-03-11 13:15:02 +0000 UTC" firstStartedPulling="2026-03-11 
13:15:03.426322761 +0000 UTC m=+4590.017586728" lastFinishedPulling="2026-03-11 13:15:05.977277537 +0000 UTC m=+4592.568541534" observedRunningTime="2026-03-11 13:15:06.486783785 +0000 UTC m=+4593.078047752" watchObservedRunningTime="2026-03-11 13:15:06.501713162 +0000 UTC m=+4593.092977169" Mar 11 13:15:09 crc kubenswrapper[4816]: I0311 13:15:09.130991 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:15:09 crc kubenswrapper[4816]: E0311 13:15:09.133659 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:15:12 crc kubenswrapper[4816]: I0311 13:15:12.798854 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:12 crc kubenswrapper[4816]: I0311 13:15:12.799452 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:12 crc kubenswrapper[4816]: I0311 13:15:12.889852 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:13 crc kubenswrapper[4816]: I0311 13:15:13.604921 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:13 crc kubenswrapper[4816]: I0311 13:15:13.664443 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:15 crc kubenswrapper[4816]: I0311 13:15:15.549446 4816 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-4xkfb" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="registry-server" containerID="cri-o://dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" gracePeriod=2 Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.455162 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.557823 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") pod \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.557898 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") pod \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.557964 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") pod \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\" (UID: \"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5\") " Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.560117 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities" (OuterVolumeSpecName: "utilities") pod "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" (UID: "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.561961 4816 generic.go:334] "Generic (PLEG): container finished" podID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerID="dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" exitCode=0 Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.562025 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerDied","Data":"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf"} Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.562067 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xkfb" event={"ID":"2d9acb13-e6a9-4833-8cfe-3801fd85e2a5","Type":"ContainerDied","Data":"870dbd348776e9d3256ca77f7c640c08bd1a93d1453b571cc8e66d309b98de54"} Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.562090 4816 scope.go:117] "RemoveContainer" containerID="dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.562216 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xkfb" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.567731 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm" (OuterVolumeSpecName: "kube-api-access-w48bm") pod "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" (UID: "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5"). InnerVolumeSpecName "kube-api-access-w48bm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.619155 4816 scope.go:117] "RemoveContainer" containerID="cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.629937 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" (UID: "2d9acb13-e6a9-4833-8cfe-3801fd85e2a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.638973 4816 scope.go:117] "RemoveContainer" containerID="18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.660015 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w48bm\" (UniqueName: \"kubernetes.io/projected/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-kube-api-access-w48bm\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.660051 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.660061 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.679761 4816 scope.go:117] "RemoveContainer" containerID="dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" Mar 11 13:15:16 crc kubenswrapper[4816]: E0311 13:15:16.680716 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf\": container with ID starting with dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf not found: ID does not exist" containerID="dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.680816 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf"} err="failed to get container status \"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf\": rpc error: code = NotFound desc = could not find container \"dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf\": container with ID starting with dfb7c1256be8ddf989f914bdb5615fd3a330627f3cfe8c0184c8f30ec02d72cf not found: ID does not exist" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.680874 4816 scope.go:117] "RemoveContainer" containerID="cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9" Mar 11 13:15:16 crc kubenswrapper[4816]: E0311 13:15:16.681351 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9\": container with ID starting with cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9 not found: ID does not exist" containerID="cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.681605 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9"} err="failed to get container status \"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9\": rpc error: code = NotFound desc = could not find container 
\"cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9\": container with ID starting with cc0f26349c021ed4fc19985f60273ecddcabfeaafbafbeb643a67d00711326f9 not found: ID does not exist" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.681660 4816 scope.go:117] "RemoveContainer" containerID="18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b" Mar 11 13:15:16 crc kubenswrapper[4816]: E0311 13:15:16.682101 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b\": container with ID starting with 18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b not found: ID does not exist" containerID="18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.682172 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b"} err="failed to get container status \"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b\": rpc error: code = NotFound desc = could not find container \"18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b\": container with ID starting with 18faad00734422c09e735a824453d3a0eff856a3fb7f98a4cee7c2b6dfd9470b not found: ID does not exist" Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.896500 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:16 crc kubenswrapper[4816]: I0311 13:15:16.905110 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4xkfb"] Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.146447 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" 
path="/var/lib/kubelet/pods/2d9acb13-e6a9-4833-8cfe-3801fd85e2a5/volumes" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.218173 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:18 crc kubenswrapper[4816]: E0311 13:15:18.219191 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="registry-server" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219224 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="registry-server" Mar 11 13:15:18 crc kubenswrapper[4816]: E0311 13:15:18.219300 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="extract-content" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219314 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="extract-content" Mar 11 13:15:18 crc kubenswrapper[4816]: E0311 13:15:18.219332 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="extract-utilities" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219346 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="extract-utilities" Mar 11 13:15:18 crc kubenswrapper[4816]: E0311 13:15:18.219372 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfbb073-59ae-4e0f-9b46-4f27865d35dd" containerName="collect-profiles" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219384 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfbb073-59ae-4e0f-9b46-4f27865d35dd" containerName="collect-profiles" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219700 4816 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2d9acb13-e6a9-4833-8cfe-3801fd85e2a5" containerName="registry-server" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.219742 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bfbb073-59ae-4e0f-9b46-4f27865d35dd" containerName="collect-profiles" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.225885 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.243344 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.288727 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.288871 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.289012 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmg6\" (UniqueName: \"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.390977 4816 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.391072 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.391177 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmg6\" (UniqueName: \"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.391596 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.391841 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.416806 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmg6\" (UniqueName: 
\"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") pod \"redhat-operators-tbw5k\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:18 crc kubenswrapper[4816]: I0311 13:15:18.563967 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:19 crc kubenswrapper[4816]: I0311 13:15:19.087126 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:19 crc kubenswrapper[4816]: I0311 13:15:19.589300 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerID="38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb" exitCode=0 Mar 11 13:15:19 crc kubenswrapper[4816]: I0311 13:15:19.589366 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerDied","Data":"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb"} Mar 11 13:15:19 crc kubenswrapper[4816]: I0311 13:15:19.589403 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerStarted","Data":"464df353d128096eb123c7f056ec9513c5bfab8441350c87f677c6c358b07739"} Mar 11 13:15:21 crc kubenswrapper[4816]: I0311 13:15:21.130635 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:15:21 crc kubenswrapper[4816]: E0311 13:15:21.131272 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:15:21 crc kubenswrapper[4816]: I0311 13:15:21.613538 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerID="18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a" exitCode=0 Mar 11 13:15:21 crc kubenswrapper[4816]: I0311 13:15:21.613607 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerDied","Data":"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a"} Mar 11 13:15:22 crc kubenswrapper[4816]: I0311 13:15:22.626027 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerStarted","Data":"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750"} Mar 11 13:15:22 crc kubenswrapper[4816]: I0311 13:15:22.668621 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tbw5k" podStartSLOduration=2.160828175 podStartE2EDuration="4.668577746s" podCreationTimestamp="2026-03-11 13:15:18 +0000 UTC" firstStartedPulling="2026-03-11 13:15:19.591204768 +0000 UTC m=+4606.182468735" lastFinishedPulling="2026-03-11 13:15:22.098954329 +0000 UTC m=+4608.690218306" observedRunningTime="2026-03-11 13:15:22.655701948 +0000 UTC m=+4609.246965935" watchObservedRunningTime="2026-03-11 13:15:22.668577746 +0000 UTC m=+4609.259841763" Mar 11 13:15:28 crc kubenswrapper[4816]: I0311 13:15:28.564771 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:28 crc kubenswrapper[4816]: I0311 13:15:28.566289 4816 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:29 crc kubenswrapper[4816]: I0311 13:15:29.630153 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbw5k" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="registry-server" probeResult="failure" output=< Mar 11 13:15:29 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 13:15:29 crc kubenswrapper[4816]: > Mar 11 13:15:36 crc kubenswrapper[4816]: I0311 13:15:36.131302 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:15:36 crc kubenswrapper[4816]: E0311 13:15:36.132131 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:15:38 crc kubenswrapper[4816]: I0311 13:15:38.635703 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:38 crc kubenswrapper[4816]: I0311 13:15:38.702880 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:38 crc kubenswrapper[4816]: I0311 13:15:38.887237 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:39 crc kubenswrapper[4816]: I0311 13:15:39.774818 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tbw5k" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" 
containerName="registry-server" containerID="cri-o://57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" gracePeriod=2 Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.208011 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.357430 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") pod \"1f414bfb-3cb5-4b0c-a92b-7333284def08\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.357493 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmg6\" (UniqueName: \"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") pod \"1f414bfb-3cb5-4b0c-a92b-7333284def08\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.357628 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") pod \"1f414bfb-3cb5-4b0c-a92b-7333284def08\" (UID: \"1f414bfb-3cb5-4b0c-a92b-7333284def08\") " Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.358543 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities" (OuterVolumeSpecName: "utilities") pod "1f414bfb-3cb5-4b0c-a92b-7333284def08" (UID: "1f414bfb-3cb5-4b0c-a92b-7333284def08"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.369677 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6" (OuterVolumeSpecName: "kube-api-access-tnmg6") pod "1f414bfb-3cb5-4b0c-a92b-7333284def08" (UID: "1f414bfb-3cb5-4b0c-a92b-7333284def08"). InnerVolumeSpecName "kube-api-access-tnmg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.459800 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnmg6\" (UniqueName: \"kubernetes.io/projected/1f414bfb-3cb5-4b0c-a92b-7333284def08-kube-api-access-tnmg6\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.459834 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.504997 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f414bfb-3cb5-4b0c-a92b-7333284def08" (UID: "1f414bfb-3cb5-4b0c-a92b-7333284def08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.561312 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f414bfb-3cb5-4b0c-a92b-7333284def08-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786292 4816 generic.go:334] "Generic (PLEG): container finished" podID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerID="57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" exitCode=0 Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786361 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerDied","Data":"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750"} Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786404 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbw5k" event={"ID":"1f414bfb-3cb5-4b0c-a92b-7333284def08","Type":"ContainerDied","Data":"464df353d128096eb123c7f056ec9513c5bfab8441350c87f677c6c358b07739"} Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786432 4816 scope.go:117] "RemoveContainer" containerID="57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.786650 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbw5k" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.815410 4816 scope.go:117] "RemoveContainer" containerID="18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.838752 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.849512 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tbw5k"] Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.867442 4816 scope.go:117] "RemoveContainer" containerID="38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.908415 4816 scope.go:117] "RemoveContainer" containerID="57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" Mar 11 13:15:40 crc kubenswrapper[4816]: E0311 13:15:40.909354 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750\": container with ID starting with 57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750 not found: ID does not exist" containerID="57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.909494 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750"} err="failed to get container status \"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750\": rpc error: code = NotFound desc = could not find container \"57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750\": container with ID starting with 57410d09ea2552fbcb89c29f2789bb81c1c1b32ab7e7ea7ea9fa27e066764750 not found: ID does 
not exist" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.909539 4816 scope.go:117] "RemoveContainer" containerID="18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a" Mar 11 13:15:40 crc kubenswrapper[4816]: E0311 13:15:40.910107 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a\": container with ID starting with 18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a not found: ID does not exist" containerID="18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.910240 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a"} err="failed to get container status \"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a\": rpc error: code = NotFound desc = could not find container \"18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a\": container with ID starting with 18d80acf5fce19393a180d78cdfbb470440af3691ce9d9fcda49c882505f2e7a not found: ID does not exist" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.910309 4816 scope.go:117] "RemoveContainer" containerID="38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb" Mar 11 13:15:40 crc kubenswrapper[4816]: E0311 13:15:40.910782 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb\": container with ID starting with 38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb not found: ID does not exist" containerID="38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb" Mar 11 13:15:40 crc kubenswrapper[4816]: I0311 13:15:40.910929 4816 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb"} err="failed to get container status \"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb\": rpc error: code = NotFound desc = could not find container \"38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb\": container with ID starting with 38be92a25193d2df70640917dba9f5574c25884773c482af1df25f1c9a24d3bb not found: ID does not exist" Mar 11 13:15:42 crc kubenswrapper[4816]: I0311 13:15:42.141269 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" path="/var/lib/kubelet/pods/1f414bfb-3cb5-4b0c-a92b-7333284def08/volumes" Mar 11 13:15:50 crc kubenswrapper[4816]: I0311 13:15:50.124550 4816 scope.go:117] "RemoveContainer" containerID="df1d35d17e400d5b7e626c6af7307f8e5a96cbf6b1e197b1b3bcbb3209f59864" Mar 11 13:15:51 crc kubenswrapper[4816]: I0311 13:15:51.130776 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:15:51 crc kubenswrapper[4816]: E0311 13:15:51.131337 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.169780 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:16:00 crc kubenswrapper[4816]: E0311 13:16:00.171085 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="extract-content" Mar 11 13:16:00 crc 
kubenswrapper[4816]: I0311 13:16:00.171108 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="extract-content" Mar 11 13:16:00 crc kubenswrapper[4816]: E0311 13:16:00.171128 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="registry-server" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.171141 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="registry-server" Mar 11 13:16:00 crc kubenswrapper[4816]: E0311 13:16:00.171179 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="extract-utilities" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.171192 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="extract-utilities" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.171456 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f414bfb-3cb5-4b0c-a92b-7333284def08" containerName="registry-server" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.172236 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.181142 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.181301 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.181706 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.189859 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") pod \"auto-csr-approver-29553916-n8r8p\" (UID: \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\") " pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.196047 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.294045 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") pod \"auto-csr-approver-29553916-n8r8p\" (UID: \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\") " pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.336301 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") pod \"auto-csr-approver-29553916-n8r8p\" (UID: \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\") " 
pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:00 crc kubenswrapper[4816]: I0311 13:16:00.528668 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:01 crc kubenswrapper[4816]: I0311 13:16:01.002714 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:16:01 crc kubenswrapper[4816]: I0311 13:16:01.975557 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" event={"ID":"ae7a49f0-ca01-4ad5-a353-5ac125523d95","Type":"ContainerStarted","Data":"7edcf5e255ce59b048a2730a5399b98ab9872184268a4d92c2000b54fbb80e71"} Mar 11 13:16:02 crc kubenswrapper[4816]: I0311 13:16:02.988753 4816 generic.go:334] "Generic (PLEG): container finished" podID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" containerID="e4b94bbef2f14a1e765d933fe579ccf92b49db99e68b93650802fa89e27f09ad" exitCode=0 Mar 11 13:16:02 crc kubenswrapper[4816]: I0311 13:16:02.988910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" event={"ID":"ae7a49f0-ca01-4ad5-a353-5ac125523d95","Type":"ContainerDied","Data":"e4b94bbef2f14a1e765d933fe579ccf92b49db99e68b93650802fa89e27f09ad"} Mar 11 13:16:03 crc kubenswrapper[4816]: I0311 13:16:03.131076 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:16:03 crc kubenswrapper[4816]: E0311 13:16:03.131381 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" 
Mar 11 13:16:04 crc kubenswrapper[4816]: I0311 13:16:04.310993 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:04 crc kubenswrapper[4816]: I0311 13:16:04.363145 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") pod \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\" (UID: \"ae7a49f0-ca01-4ad5-a353-5ac125523d95\") " Mar 11 13:16:04 crc kubenswrapper[4816]: I0311 13:16:04.372258 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml" (OuterVolumeSpecName: "kube-api-access-jk2ml") pod "ae7a49f0-ca01-4ad5-a353-5ac125523d95" (UID: "ae7a49f0-ca01-4ad5-a353-5ac125523d95"). InnerVolumeSpecName "kube-api-access-jk2ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:16:04 crc kubenswrapper[4816]: I0311 13:16:04.464406 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk2ml\" (UniqueName: \"kubernetes.io/projected/ae7a49f0-ca01-4ad5-a353-5ac125523d95-kube-api-access-jk2ml\") on node \"crc\" DevicePath \"\"" Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.007691 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" event={"ID":"ae7a49f0-ca01-4ad5-a353-5ac125523d95","Type":"ContainerDied","Data":"7edcf5e255ce59b048a2730a5399b98ab9872184268a4d92c2000b54fbb80e71"} Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.007732 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7edcf5e255ce59b048a2730a5399b98ab9872184268a4d92c2000b54fbb80e71" Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.007764 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553916-n8r8p" Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.406401 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"] Mar 11 13:16:05 crc kubenswrapper[4816]: I0311 13:16:05.416696 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553910-xrjfk"] Mar 11 13:16:06 crc kubenswrapper[4816]: I0311 13:16:06.148769 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee2e218-36ee-47c0-9bca-f2f6affd5b02" path="/var/lib/kubelet/pods/4ee2e218-36ee-47c0-9bca-f2f6affd5b02/volumes" Mar 11 13:16:18 crc kubenswrapper[4816]: I0311 13:16:18.131397 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:16:18 crc kubenswrapper[4816]: E0311 13:16:18.133019 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:16:31 crc kubenswrapper[4816]: I0311 13:16:31.130152 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:16:31 crc kubenswrapper[4816]: E0311 13:16:31.131295 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:16:43 crc kubenswrapper[4816]: I0311 13:16:43.130634 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:16:44 crc kubenswrapper[4816]: I0311 13:16:44.405644 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4"} Mar 11 13:16:50 crc kubenswrapper[4816]: I0311 13:16:50.230966 4816 scope.go:117] "RemoveContainer" containerID="13136e90ba59855de085b0d87fba900a964c210d6db5608d7bd773e44d7b1505" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.173979 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:18:00 crc kubenswrapper[4816]: E0311 13:18:00.175427 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" containerName="oc" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.175463 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" containerName="oc" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.175776 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" containerName="oc" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.176810 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.179709 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.180864 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.181520 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.183952 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.302926 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") pod \"auto-csr-approver-29553918-7rzs7\" (UID: \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\") " pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.405149 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") pod \"auto-csr-approver-29553918-7rzs7\" (UID: \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\") " pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.432005 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") pod \"auto-csr-approver-29553918-7rzs7\" (UID: \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\") " 
pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:00 crc kubenswrapper[4816]: I0311 13:18:00.505003 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:01 crc kubenswrapper[4816]: I0311 13:18:01.017635 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:18:01 crc kubenswrapper[4816]: I0311 13:18:01.175454 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" event={"ID":"d95b7b2b-acc3-47bd-b762-29e39ca68f93","Type":"ContainerStarted","Data":"4b6c1548264a5a5148c9addf189154139bb8f86400a8b334ab4e20ba6beb974e"} Mar 11 13:18:03 crc kubenswrapper[4816]: I0311 13:18:03.197302 4816 generic.go:334] "Generic (PLEG): container finished" podID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" containerID="740acfe6fc04d23ba8749fd0de9541e5bd0ee02db427a2bd65a7b93925e05ec4" exitCode=0 Mar 11 13:18:03 crc kubenswrapper[4816]: I0311 13:18:03.197418 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" event={"ID":"d95b7b2b-acc3-47bd-b762-29e39ca68f93","Type":"ContainerDied","Data":"740acfe6fc04d23ba8749fd0de9541e5bd0ee02db427a2bd65a7b93925e05ec4"} Mar 11 13:18:04 crc kubenswrapper[4816]: I0311 13:18:04.578981 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:04 crc kubenswrapper[4816]: I0311 13:18:04.776019 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") pod \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\" (UID: \"d95b7b2b-acc3-47bd-b762-29e39ca68f93\") " Mar 11 13:18:04 crc kubenswrapper[4816]: I0311 13:18:04.783662 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx" (OuterVolumeSpecName: "kube-api-access-f7mbx") pod "d95b7b2b-acc3-47bd-b762-29e39ca68f93" (UID: "d95b7b2b-acc3-47bd-b762-29e39ca68f93"). InnerVolumeSpecName "kube-api-access-f7mbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:18:04 crc kubenswrapper[4816]: I0311 13:18:04.878294 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7mbx\" (UniqueName: \"kubernetes.io/projected/d95b7b2b-acc3-47bd-b762-29e39ca68f93-kube-api-access-f7mbx\") on node \"crc\" DevicePath \"\"" Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.219067 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" event={"ID":"d95b7b2b-acc3-47bd-b762-29e39ca68f93","Type":"ContainerDied","Data":"4b6c1548264a5a5148c9addf189154139bb8f86400a8b334ab4e20ba6beb974e"} Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.219134 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b6c1548264a5a5148c9addf189154139bb8f86400a8b334ab4e20ba6beb974e" Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.219164 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553918-7rzs7" Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.689475 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"] Mar 11 13:18:05 crc kubenswrapper[4816]: I0311 13:18:05.693556 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553912-dgw9l"] Mar 11 13:18:06 crc kubenswrapper[4816]: I0311 13:18:06.148077 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc88fac-43d8-4178-9b36-fc5bd4b04818" path="/var/lib/kubelet/pods/0cc88fac-43d8-4178-9b36-fc5bd4b04818/volumes" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.588080 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2l2xd"] Mar 11 13:18:20 crc kubenswrapper[4816]: E0311 13:18:20.591672 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" containerName="oc" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.591876 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" containerName="oc" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.592328 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" containerName="oc" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.594587 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.607065 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2l2xd"] Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.691684 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-catalog-content\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.691746 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-utilities\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.691779 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2l74\" (UniqueName: \"kubernetes.io/projected/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-kube-api-access-j2l74\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.792472 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-catalog-content\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.792860 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-utilities\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.793029 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2l74\" (UniqueName: \"kubernetes.io/projected/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-kube-api-access-j2l74\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.793224 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-catalog-content\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.793459 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-utilities\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.819686 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2l74\" (UniqueName: \"kubernetes.io/projected/a13e0873-9c6c-46d7-b0bf-4ef50c40a918-kube-api-access-j2l74\") pod \"community-operators-2l2xd\" (UID: \"a13e0873-9c6c-46d7-b0bf-4ef50c40a918\") " pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:20 crc kubenswrapper[4816]: I0311 13:18:20.932412 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:21 crc kubenswrapper[4816]: I0311 13:18:21.478304 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2l2xd"] Mar 11 13:18:22 crc kubenswrapper[4816]: I0311 13:18:22.386443 4816 generic.go:334] "Generic (PLEG): container finished" podID="a13e0873-9c6c-46d7-b0bf-4ef50c40a918" containerID="283b0a0d57fd31208c4a0ce99b6815b3e12f2f8da3c601f5ae8cc6bcdd24a22b" exitCode=0 Mar 11 13:18:22 crc kubenswrapper[4816]: I0311 13:18:22.386511 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2l2xd" event={"ID":"a13e0873-9c6c-46d7-b0bf-4ef50c40a918","Type":"ContainerDied","Data":"283b0a0d57fd31208c4a0ce99b6815b3e12f2f8da3c601f5ae8cc6bcdd24a22b"} Mar 11 13:18:22 crc kubenswrapper[4816]: I0311 13:18:22.386759 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2l2xd" event={"ID":"a13e0873-9c6c-46d7-b0bf-4ef50c40a918","Type":"ContainerStarted","Data":"90e3952219eafce29019495f39895e1b333e7da922f783f7118925b2341c1489"} Mar 11 13:18:27 crc kubenswrapper[4816]: I0311 13:18:27.423572 4816 generic.go:334] "Generic (PLEG): container finished" podID="a13e0873-9c6c-46d7-b0bf-4ef50c40a918" containerID="5db91ea8179782dc441b36bffafd7a6d8f6d49078d5ae0b5acab10829f852071" exitCode=0 Mar 11 13:18:27 crc kubenswrapper[4816]: I0311 13:18:27.424179 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2l2xd" event={"ID":"a13e0873-9c6c-46d7-b0bf-4ef50c40a918","Type":"ContainerDied","Data":"5db91ea8179782dc441b36bffafd7a6d8f6d49078d5ae0b5acab10829f852071"} Mar 11 13:18:29 crc kubenswrapper[4816]: I0311 13:18:29.450874 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2l2xd" 
event={"ID":"a13e0873-9c6c-46d7-b0bf-4ef50c40a918","Type":"ContainerStarted","Data":"6c9b8a20bb7fcfdf0014b6b92108f77b550658cc551ad24c0d410cf73f181bed"} Mar 11 13:18:29 crc kubenswrapper[4816]: I0311 13:18:29.472224 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2l2xd" podStartSLOduration=3.071756606 podStartE2EDuration="9.472198267s" podCreationTimestamp="2026-03-11 13:18:20 +0000 UTC" firstStartedPulling="2026-03-11 13:18:22.388174401 +0000 UTC m=+4788.979438368" lastFinishedPulling="2026-03-11 13:18:28.788616062 +0000 UTC m=+4795.379880029" observedRunningTime="2026-03-11 13:18:29.466514615 +0000 UTC m=+4796.057778592" watchObservedRunningTime="2026-03-11 13:18:29.472198267 +0000 UTC m=+4796.063462234" Mar 11 13:18:30 crc kubenswrapper[4816]: I0311 13:18:30.932902 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:30 crc kubenswrapper[4816]: I0311 13:18:30.933370 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:32 crc kubenswrapper[4816]: I0311 13:18:32.002740 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2l2xd" podUID="a13e0873-9c6c-46d7-b0bf-4ef50c40a918" containerName="registry-server" probeResult="failure" output=< Mar 11 13:18:32 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 13:18:32 crc kubenswrapper[4816]: > Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.008170 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.076091 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2l2xd" Mar 11 13:18:41 crc 
kubenswrapper[4816]: I0311 13:18:41.161686 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2l2xd"] Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.263024 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.263440 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qb5pd" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="registry-server" containerID="cri-o://df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336" gracePeriod=2 Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.569544 4816 generic.go:334] "Generic (PLEG): container finished" podID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerID="df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336" exitCode=0 Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.569633 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerDied","Data":"df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336"} Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.672842 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.874110 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") pod \"963d27c0-f203-4997-aa60-ac73d2a54cc0\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.874433 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") pod \"963d27c0-f203-4997-aa60-ac73d2a54cc0\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.874495 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") pod \"963d27c0-f203-4997-aa60-ac73d2a54cc0\" (UID: \"963d27c0-f203-4997-aa60-ac73d2a54cc0\") " Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.874964 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities" (OuterVolumeSpecName: "utilities") pod "963d27c0-f203-4997-aa60-ac73d2a54cc0" (UID: "963d27c0-f203-4997-aa60-ac73d2a54cc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.886993 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj" (OuterVolumeSpecName: "kube-api-access-f44xj") pod "963d27c0-f203-4997-aa60-ac73d2a54cc0" (UID: "963d27c0-f203-4997-aa60-ac73d2a54cc0"). InnerVolumeSpecName "kube-api-access-f44xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.940306 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "963d27c0-f203-4997-aa60-ac73d2a54cc0" (UID: "963d27c0-f203-4997-aa60-ac73d2a54cc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.975526 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.975566 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d27c0-f203-4997-aa60-ac73d2a54cc0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:18:41 crc kubenswrapper[4816]: I0311 13:18:41.975578 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f44xj\" (UniqueName: \"kubernetes.io/projected/963d27c0-f203-4997-aa60-ac73d2a54cc0-kube-api-access-f44xj\") on node \"crc\" DevicePath \"\"" Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.582534 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb5pd" Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.582531 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb5pd" event={"ID":"963d27c0-f203-4997-aa60-ac73d2a54cc0","Type":"ContainerDied","Data":"5d89076f0fdd1a586d2d1d9d12f836502df9b389006d64897e25f5fabea5fa22"} Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.582610 4816 scope.go:117] "RemoveContainer" containerID="df036ebc1022629bd7df15b57ae8610b239015cc838a88645d459c84c864e336" Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.610730 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.623679 4816 scope.go:117] "RemoveContainer" containerID="8c235b1052133359e398ac00a2eee490f7a085338a2901f71eac6e872bda6cbf" Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.628357 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qb5pd"] Mar 11 13:18:42 crc kubenswrapper[4816]: I0311 13:18:42.663738 4816 scope.go:117] "RemoveContainer" containerID="6e58f19a27ae3010beb47e8be328d7c7ee7c8f14b5f34d2213706b6f25097290" Mar 11 13:18:44 crc kubenswrapper[4816]: I0311 13:18:44.147048 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" path="/var/lib/kubelet/pods/963d27c0-f203-4997-aa60-ac73d2a54cc0/volumes" Mar 11 13:18:50 crc kubenswrapper[4816]: I0311 13:18:50.367549 4816 scope.go:117] "RemoveContainer" containerID="148ded4a02efdc34a61cfc1e6b248706834d114bbcd8c2d3fc0a1082e7f112b8" Mar 11 13:19:09 crc kubenswrapper[4816]: I0311 13:19:09.515448 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:19:09 crc kubenswrapper[4816]: I0311 13:19:09.516111 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:19:23 crc kubenswrapper[4816]: I0311 13:19:23.992892 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-69hv5"] Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.005173 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-69hv5"] Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.143993 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7462073d-1852-4032-87bc-e0a4b973f92f" path="/var/lib/kubelet/pods/7462073d-1852-4032-87bc-e0a4b973f92f/volumes" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.152978 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:24 crc kubenswrapper[4816]: E0311 13:19:24.158699 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="registry-server" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.158727 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="registry-server" Mar 11 13:19:24 crc kubenswrapper[4816]: E0311 13:19:24.158741 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="extract-utilities" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.158749 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="extract-utilities" Mar 11 13:19:24 
crc kubenswrapper[4816]: E0311 13:19:24.158763 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="extract-content" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.158772 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="extract-content" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.158988 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="963d27c0-f203-4997-aa60-ac73d2a54cc0" containerName="registry-server" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.160105 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.160198 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.171287 4816 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-zmgc9" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.171375 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.171432 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.171434 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.251841 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 
13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.251895 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.251930 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.354117 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.354766 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.354928 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.355362 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.355636 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.391287 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") pod \"crc-storage-crc-gjqlz\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.487197 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:24 crc kubenswrapper[4816]: I0311 13:19:24.779234 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:24 crc kubenswrapper[4816]: W0311 13:19:24.782001 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fdf8d9b_a464_4d2f_a2e5_a4854b7f9ab3.slice/crio-e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124 WatchSource:0}: Error finding container e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124: Status 404 returned error can't find the container with id e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124 Mar 11 13:19:25 crc kubenswrapper[4816]: I0311 13:19:25.034480 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gjqlz" event={"ID":"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3","Type":"ContainerStarted","Data":"e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124"} Mar 11 13:19:26 crc kubenswrapper[4816]: I0311 13:19:26.046156 4816 generic.go:334] "Generic (PLEG): container finished" podID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" containerID="3c47893cabbfc635edaea2ea48266ffc815a61e1e094878326c38fe6119ee1b9" exitCode=0 Mar 11 13:19:26 crc kubenswrapper[4816]: I0311 13:19:26.046237 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gjqlz" event={"ID":"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3","Type":"ContainerDied","Data":"3c47893cabbfc635edaea2ea48266ffc815a61e1e094878326c38fe6119ee1b9"} Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.420479 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.527470 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") pod \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.527739 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") pod \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.527844 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" (UID: "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.528027 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") pod \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\" (UID: \"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3\") " Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.528456 4816 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.536913 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb" (OuterVolumeSpecName: "kube-api-access-7dqpb") pod "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" (UID: "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3"). InnerVolumeSpecName "kube-api-access-7dqpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.553708 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" (UID: "3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.629828 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dqpb\" (UniqueName: \"kubernetes.io/projected/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-kube-api-access-7dqpb\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:27 crc kubenswrapper[4816]: I0311 13:19:27.629872 4816 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:28 crc kubenswrapper[4816]: I0311 13:19:28.068638 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gjqlz" event={"ID":"3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3","Type":"ContainerDied","Data":"e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124"} Mar 11 13:19:28 crc kubenswrapper[4816]: I0311 13:19:28.069128 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9fc92572a812a6a0a93fb11d768dee3bd44fc98e9449d2ffae9276055dc4124" Mar 11 13:19:28 crc kubenswrapper[4816]: I0311 13:19:28.068815 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gjqlz" Mar 11 13:19:29 crc kubenswrapper[4816]: I0311 13:19:29.986775 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:29 crc kubenswrapper[4816]: I0311 13:19:29.997206 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-gjqlz"] Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.174808 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" path="/var/lib/kubelet/pods/3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3/volumes" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.176774 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-mjnc9"] Mar 11 13:19:30 crc kubenswrapper[4816]: E0311 13:19:30.178872 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" containerName="storage" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.178917 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" containerName="storage" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.179534 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fdf8d9b-a464-4d2f-a2e5-a4854b7f9ab3" containerName="storage" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.180746 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.184981 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.185741 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.187631 4816 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-zmgc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.187649 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.193100 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mjnc9"] Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.278491 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.278577 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.278613 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") pod \"crc-storage-crc-mjnc9\" (UID: 
\"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.380638 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.381404 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.381642 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.381304 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.382720 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.419333 4816 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") pod \"crc-storage-crc-mjnc9\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.512433 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:30 crc kubenswrapper[4816]: I0311 13:19:30.803570 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mjnc9"] Mar 11 13:19:31 crc kubenswrapper[4816]: I0311 13:19:31.098698 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mjnc9" event={"ID":"d30e4b47-6db7-45ec-b6e8-22a9e619d462","Type":"ContainerStarted","Data":"a16f89b6d7d619cf99a106f26780c963377fc7bd5ef58748edc7d4f4741cc356"} Mar 11 13:19:32 crc kubenswrapper[4816]: I0311 13:19:32.108822 4816 generic.go:334] "Generic (PLEG): container finished" podID="d30e4b47-6db7-45ec-b6e8-22a9e619d462" containerID="d83f72f91a08d6b75ba22e2ebf0fc9900a5fbe8d91a5c626eec467d809c24f71" exitCode=0 Mar 11 13:19:32 crc kubenswrapper[4816]: I0311 13:19:32.108963 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mjnc9" event={"ID":"d30e4b47-6db7-45ec-b6e8-22a9e619d462","Type":"ContainerDied","Data":"d83f72f91a08d6b75ba22e2ebf0fc9900a5fbe8d91a5c626eec467d809c24f71"} Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.563324 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633240 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") pod \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633417 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") pod \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633499 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") pod \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\" (UID: \"d30e4b47-6db7-45ec-b6e8-22a9e619d462\") " Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633679 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d30e4b47-6db7-45ec-b6e8-22a9e619d462" (UID: "d30e4b47-6db7-45ec-b6e8-22a9e619d462"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.633931 4816 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d30e4b47-6db7-45ec-b6e8-22a9e619d462-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.640422 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz" (OuterVolumeSpecName: "kube-api-access-q5jdz") pod "d30e4b47-6db7-45ec-b6e8-22a9e619d462" (UID: "d30e4b47-6db7-45ec-b6e8-22a9e619d462"). InnerVolumeSpecName "kube-api-access-q5jdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.658578 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d30e4b47-6db7-45ec-b6e8-22a9e619d462" (UID: "d30e4b47-6db7-45ec-b6e8-22a9e619d462"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.735372 4816 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d30e4b47-6db7-45ec-b6e8-22a9e619d462-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:33 crc kubenswrapper[4816]: I0311 13:19:33.735428 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5jdz\" (UniqueName: \"kubernetes.io/projected/d30e4b47-6db7-45ec-b6e8-22a9e619d462-kube-api-access-q5jdz\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:34 crc kubenswrapper[4816]: I0311 13:19:34.142022 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mjnc9" Mar 11 13:19:34 crc kubenswrapper[4816]: I0311 13:19:34.176143 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mjnc9" event={"ID":"d30e4b47-6db7-45ec-b6e8-22a9e619d462","Type":"ContainerDied","Data":"a16f89b6d7d619cf99a106f26780c963377fc7bd5ef58748edc7d4f4741cc356"} Mar 11 13:19:34 crc kubenswrapper[4816]: I0311 13:19:34.176199 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a16f89b6d7d619cf99a106f26780c963377fc7bd5ef58748edc7d4f4741cc356" Mar 11 13:19:39 crc kubenswrapper[4816]: I0311 13:19:39.515140 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:19:39 crc kubenswrapper[4816]: I0311 13:19:39.515943 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.425497 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:45 crc kubenswrapper[4816]: E0311 13:19:45.433703 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30e4b47-6db7-45ec-b6e8-22a9e619d462" containerName="storage" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.433735 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30e4b47-6db7-45ec-b6e8-22a9e619d462" containerName="storage" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.433989 4816 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d30e4b47-6db7-45ec-b6e8-22a9e619d462" containerName="storage" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.435509 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.447825 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.568021 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.568167 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.568209 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.669592 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") pod \"redhat-marketplace-dlh8d\" 
(UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.669679 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.669771 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.670435 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.670509 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.696089 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") pod \"redhat-marketplace-dlh8d\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " 
pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:45 crc kubenswrapper[4816]: I0311 13:19:45.814049 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:46 crc kubenswrapper[4816]: I0311 13:19:46.071224 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:46 crc kubenswrapper[4816]: I0311 13:19:46.247573 4816 generic.go:334] "Generic (PLEG): container finished" podID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerID="8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741" exitCode=0 Mar 11 13:19:46 crc kubenswrapper[4816]: I0311 13:19:46.247622 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerDied","Data":"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741"} Mar 11 13:19:46 crc kubenswrapper[4816]: I0311 13:19:46.247677 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerStarted","Data":"5fd9a757141accae25718c44ebcf77b36ba1f105d8089b8809bf7ab1041950c1"} Mar 11 13:19:48 crc kubenswrapper[4816]: I0311 13:19:48.268316 4816 generic.go:334] "Generic (PLEG): container finished" podID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerID="b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599" exitCode=0 Mar 11 13:19:48 crc kubenswrapper[4816]: I0311 13:19:48.268402 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerDied","Data":"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599"} Mar 11 13:19:49 crc kubenswrapper[4816]: I0311 13:19:49.280130 4816 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerStarted","Data":"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021"} Mar 11 13:19:49 crc kubenswrapper[4816]: I0311 13:19:49.305002 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dlh8d" podStartSLOduration=1.695181204 podStartE2EDuration="4.304975043s" podCreationTimestamp="2026-03-11 13:19:45 +0000 UTC" firstStartedPulling="2026-03-11 13:19:46.249130781 +0000 UTC m=+4872.840394748" lastFinishedPulling="2026-03-11 13:19:48.85892457 +0000 UTC m=+4875.450188587" observedRunningTime="2026-03-11 13:19:49.30313067 +0000 UTC m=+4875.894394667" watchObservedRunningTime="2026-03-11 13:19:49.304975043 +0000 UTC m=+4875.896239030" Mar 11 13:19:50 crc kubenswrapper[4816]: I0311 13:19:50.471238 4816 scope.go:117] "RemoveContainer" containerID="0ee4f053b0c8963adb31e4e6ffaf9c7c100dafccbfa493c26f5254141c13917c" Mar 11 13:19:55 crc kubenswrapper[4816]: I0311 13:19:55.814597 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:55 crc kubenswrapper[4816]: I0311 13:19:55.816791 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:55 crc kubenswrapper[4816]: I0311 13:19:55.876308 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:56 crc kubenswrapper[4816]: I0311 13:19:56.391388 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:56 crc kubenswrapper[4816]: I0311 13:19:56.454226 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:58 crc 
kubenswrapper[4816]: I0311 13:19:58.355762 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dlh8d" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="registry-server" containerID="cri-o://7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" gracePeriod=2 Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.275180 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.367625 4816 generic.go:334] "Generic (PLEG): container finished" podID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerID="7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" exitCode=0 Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.367677 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dlh8d" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.367720 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerDied","Data":"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021"} Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.368518 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dlh8d" event={"ID":"c131fc25-0347-4783-bb0e-51d87ef555ea","Type":"ContainerDied","Data":"5fd9a757141accae25718c44ebcf77b36ba1f105d8089b8809bf7ab1041950c1"} Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.368565 4816 scope.go:117] "RemoveContainer" containerID="7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.393591 4816 scope.go:117] "RemoveContainer" containerID="b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599" 
Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.396991 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") pod \"c131fc25-0347-4783-bb0e-51d87ef555ea\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.397105 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") pod \"c131fc25-0347-4783-bb0e-51d87ef555ea\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.397131 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") pod \"c131fc25-0347-4783-bb0e-51d87ef555ea\" (UID: \"c131fc25-0347-4783-bb0e-51d87ef555ea\") " Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.398314 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities" (OuterVolumeSpecName: "utilities") pod "c131fc25-0347-4783-bb0e-51d87ef555ea" (UID: "c131fc25-0347-4783-bb0e-51d87ef555ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.404771 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc" (OuterVolumeSpecName: "kube-api-access-s6knc") pod "c131fc25-0347-4783-bb0e-51d87ef555ea" (UID: "c131fc25-0347-4783-bb0e-51d87ef555ea"). InnerVolumeSpecName "kube-api-access-s6knc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.424045 4816 scope.go:117] "RemoveContainer" containerID="8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.437295 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c131fc25-0347-4783-bb0e-51d87ef555ea" (UID: "c131fc25-0347-4783-bb0e-51d87ef555ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.461025 4816 scope.go:117] "RemoveContainer" containerID="7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" Mar 11 13:19:59 crc kubenswrapper[4816]: E0311 13:19:59.461827 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021\": container with ID starting with 7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021 not found: ID does not exist" containerID="7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.461880 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021"} err="failed to get container status \"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021\": rpc error: code = NotFound desc = could not find container \"7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021\": container with ID starting with 7fe8c0215bd2b054a118731083714c88ac6f6daea94e7b466940b02b163a8021 not found: ID does not exist" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.461907 4816 scope.go:117] 
"RemoveContainer" containerID="b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599" Mar 11 13:19:59 crc kubenswrapper[4816]: E0311 13:19:59.462388 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599\": container with ID starting with b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599 not found: ID does not exist" containerID="b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.462587 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599"} err="failed to get container status \"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599\": rpc error: code = NotFound desc = could not find container \"b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599\": container with ID starting with b8e1e49f1243bdfe5a560ce329e08c61db4b0da9915a68ba0d6eb2188a2d3599 not found: ID does not exist" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.462623 4816 scope.go:117] "RemoveContainer" containerID="8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741" Mar 11 13:19:59 crc kubenswrapper[4816]: E0311 13:19:59.463019 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741\": container with ID starting with 8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741 not found: ID does not exist" containerID="8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.463073 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741"} err="failed to get container status \"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741\": rpc error: code = NotFound desc = could not find container \"8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741\": container with ID starting with 8dada37ecd1c3c09c55e467ba07b0c56cbf5c669183b649851bded5da283a741 not found: ID does not exist" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.498863 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.498894 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6knc\" (UniqueName: \"kubernetes.io/projected/c131fc25-0347-4783-bb0e-51d87ef555ea-kube-api-access-s6knc\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.498905 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c131fc25-0347-4783-bb0e-51d87ef555ea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.705351 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:19:59 crc kubenswrapper[4816]: I0311 13:19:59.711634 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dlh8d"] Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.147391 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" path="/var/lib/kubelet/pods/c131fc25-0347-4783-bb0e-51d87ef555ea/volumes" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.148581 4816 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:20:00 crc kubenswrapper[4816]: E0311 13:20:00.148979 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="extract-utilities" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.149009 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="extract-utilities" Mar 11 13:20:00 crc kubenswrapper[4816]: E0311 13:20:00.149041 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="registry-server" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.149054 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="registry-server" Mar 11 13:20:00 crc kubenswrapper[4816]: E0311 13:20:00.149078 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="extract-content" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.149090 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="extract-content" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.149740 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c131fc25-0347-4783-bb0e-51d87ef555ea" containerName="registry-server" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.150432 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.150552 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.152858 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.153079 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.155272 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.312340 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") pod \"auto-csr-approver-29553920-h6hh5\" (UID: \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\") " pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.414128 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") pod \"auto-csr-approver-29553920-h6hh5\" (UID: \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\") " pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.432611 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") pod \"auto-csr-approver-29553920-h6hh5\" (UID: \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\") " pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.471722 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:00 crc kubenswrapper[4816]: I0311 13:20:00.765476 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:20:00 crc kubenswrapper[4816]: W0311 13:20:00.766529 4816 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod193e1468_2f5b_4e66_94f3_a7fc184c7e01.slice/crio-21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee WatchSource:0}: Error finding container 21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee: Status 404 returned error can't find the container with id 21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee Mar 11 13:20:01 crc kubenswrapper[4816]: I0311 13:20:01.384698 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" event={"ID":"193e1468-2f5b-4e66-94f3-a7fc184c7e01","Type":"ContainerStarted","Data":"21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee"} Mar 11 13:20:03 crc kubenswrapper[4816]: I0311 13:20:03.403311 4816 generic.go:334] "Generic (PLEG): container finished" podID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" containerID="4a3cade9d3e8a7bb5a9e71032c96de36c1178b4f7d16ddbd543510f67b6be155" exitCode=0 Mar 11 13:20:03 crc kubenswrapper[4816]: I0311 13:20:03.403442 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" event={"ID":"193e1468-2f5b-4e66-94f3-a7fc184c7e01","Type":"ContainerDied","Data":"4a3cade9d3e8a7bb5a9e71032c96de36c1178b4f7d16ddbd543510f67b6be155"} Mar 11 13:20:04 crc kubenswrapper[4816]: I0311 13:20:04.762569 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:04 crc kubenswrapper[4816]: I0311 13:20:04.887126 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") pod \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\" (UID: \"193e1468-2f5b-4e66-94f3-a7fc184c7e01\") " Mar 11 13:20:04 crc kubenswrapper[4816]: I0311 13:20:04.896349 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9" (OuterVolumeSpecName: "kube-api-access-psxg9") pod "193e1468-2f5b-4e66-94f3-a7fc184c7e01" (UID: "193e1468-2f5b-4e66-94f3-a7fc184c7e01"). InnerVolumeSpecName "kube-api-access-psxg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:20:04 crc kubenswrapper[4816]: I0311 13:20:04.989176 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psxg9\" (UniqueName: \"kubernetes.io/projected/193e1468-2f5b-4e66-94f3-a7fc184c7e01-kube-api-access-psxg9\") on node \"crc\" DevicePath \"\"" Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.427411 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" event={"ID":"193e1468-2f5b-4e66-94f3-a7fc184c7e01","Type":"ContainerDied","Data":"21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee"} Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.427507 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21a46c47a0bed8bd38e19f517c3c8cd2030d7463b8e1e69fa397c74be15882ee" Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.427520 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553920-h6hh5" Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.856022 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:20:05 crc kubenswrapper[4816]: I0311 13:20:05.862592 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553914-vtpr2"] Mar 11 13:20:06 crc kubenswrapper[4816]: I0311 13:20:06.146919 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b79556-cf6a-450f-9214-70d0854dc630" path="/var/lib/kubelet/pods/32b79556-cf6a-450f-9214-70d0854dc630/volumes" Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.514932 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.515676 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.515767 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.516874 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 13:20:09 crc kubenswrapper[4816]: I0311 13:20:09.516986 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4" gracePeriod=600 Mar 11 13:20:10 crc kubenswrapper[4816]: I0311 13:20:10.473286 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" containerID="ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4" exitCode=0 Mar 11 13:20:10 crc kubenswrapper[4816]: I0311 13:20:10.473326 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4"} Mar 11 13:20:10 crc kubenswrapper[4816]: I0311 13:20:10.473356 4816 scope.go:117] "RemoveContainer" containerID="feffe5b2dfe848c315cf8be61701a4665673828a287a584d38912d4142b6653e" Mar 11 13:20:11 crc kubenswrapper[4816]: I0311 13:20:11.483608 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"} Mar 11 13:20:50 crc kubenswrapper[4816]: I0311 13:20:50.541886 4816 scope.go:117] "RemoveContainer" containerID="8e7758cfa0d68340bf0bfe400a0bcdda434a161dca369cd6a56c8194d33e640d" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.170896 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"] Mar 11 13:22:00 crc kubenswrapper[4816]: E0311 
13:22:00.173441 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" containerName="oc" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.173474 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" containerName="oc" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.173776 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" containerName="oc" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.174541 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.177082 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.178041 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.179411 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.185034 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"] Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.343479 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") pod \"auto-csr-approver-29553922-l2chb\" (UID: \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\") " pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.444889 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") pod \"auto-csr-approver-29553922-l2chb\" (UID: \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\") " pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.475359 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") pod \"auto-csr-approver-29553922-l2chb\" (UID: \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\") " pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.504342 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.777581 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"] Mar 11 13:22:00 crc kubenswrapper[4816]: I0311 13:22:00.790414 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 13:22:01 crc kubenswrapper[4816]: I0311 13:22:01.481755 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553922-l2chb" event={"ID":"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02","Type":"ContainerStarted","Data":"9cef72375d217d58a4f49a8becffa8fc99ece346d358c56e97b42d8d61f1a2d3"} Mar 11 13:22:02 crc kubenswrapper[4816]: I0311 13:22:02.494914 4816 generic.go:334] "Generic (PLEG): container finished" podID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" containerID="5610d0163da9f92dcf1f4addb326b68bb7bee62775e25ffcf227b46aacd6327b" exitCode=0 Mar 11 13:22:02 crc kubenswrapper[4816]: I0311 13:22:02.495029 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553922-l2chb" 
event={"ID":"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02","Type":"ContainerDied","Data":"5610d0163da9f92dcf1f4addb326b68bb7bee62775e25ffcf227b46aacd6327b"} Mar 11 13:22:03 crc kubenswrapper[4816]: I0311 13:22:03.999370 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.096620 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") pod \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\" (UID: \"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02\") " Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.104797 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74" (OuterVolumeSpecName: "kube-api-access-dkx74") pod "a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" (UID: "a09c6fad-26f7-4ea2-84fc-5d2efb86fd02"). InnerVolumeSpecName "kube-api-access-dkx74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.198794 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkx74\" (UniqueName: \"kubernetes.io/projected/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02-kube-api-access-dkx74\") on node \"crc\" DevicePath \"\"" Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.521740 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553922-l2chb" event={"ID":"a09c6fad-26f7-4ea2-84fc-5d2efb86fd02","Type":"ContainerDied","Data":"9cef72375d217d58a4f49a8becffa8fc99ece346d358c56e97b42d8d61f1a2d3"} Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.521781 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cef72375d217d58a4f49a8becffa8fc99ece346d358c56e97b42d8d61f1a2d3" Mar 11 13:22:04 crc kubenswrapper[4816]: I0311 13:22:04.521810 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553922-l2chb" Mar 11 13:22:05 crc kubenswrapper[4816]: I0311 13:22:05.081013 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:22:05 crc kubenswrapper[4816]: I0311 13:22:05.088236 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553916-n8r8p"] Mar 11 13:22:06 crc kubenswrapper[4816]: I0311 13:22:06.146454 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7a49f0-ca01-4ad5-a353-5ac125523d95" path="/var/lib/kubelet/pods/ae7a49f0-ca01-4ad5-a353-5ac125523d95/volumes" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.008214 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:22:20 crc kubenswrapper[4816]: E0311 13:22:20.009123 4816 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" containerName="oc" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.009139 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" containerName="oc" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.009400 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" containerName="oc" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.010314 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.012217 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rbcl7"/"kube-root-ca.crt" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.014459 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rbcl7"/"openshift-service-ca.crt" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.027115 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.048356 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.048407 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" 
Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.150034 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.150090 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.150650 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.179320 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") pod \"must-gather-k8xh5\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.329341 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:22:20 crc kubenswrapper[4816]: I0311 13:22:20.754626 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:22:21 crc kubenswrapper[4816]: I0311 13:22:21.661212 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" event={"ID":"240c7704-66e9-4d5b-9b4f-cf8a80365c26","Type":"ContainerStarted","Data":"3114da44a220bd5bf16e3c17711dc438c13c8e76a81c57bfcca181e77302faaa"} Mar 11 13:22:27 crc kubenswrapper[4816]: I0311 13:22:27.729156 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" event={"ID":"240c7704-66e9-4d5b-9b4f-cf8a80365c26","Type":"ContainerStarted","Data":"54930770a8edcb6e0930ad6af1934aac20a12f0e3525f98e64859acac70909c5"} Mar 11 13:22:27 crc kubenswrapper[4816]: I0311 13:22:27.729737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" event={"ID":"240c7704-66e9-4d5b-9b4f-cf8a80365c26","Type":"ContainerStarted","Data":"275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996"} Mar 11 13:22:27 crc kubenswrapper[4816]: I0311 13:22:27.763774 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" podStartSLOduration=2.943973745 podStartE2EDuration="8.763752964s" podCreationTimestamp="2026-03-11 13:22:19 +0000 UTC" firstStartedPulling="2026-03-11 13:22:20.767890818 +0000 UTC m=+5027.359154785" lastFinishedPulling="2026-03-11 13:22:26.587670037 +0000 UTC m=+5033.178934004" observedRunningTime="2026-03-11 13:22:27.753533161 +0000 UTC m=+5034.344797148" watchObservedRunningTime="2026-03-11 13:22:27.763752964 +0000 UTC m=+5034.355016951" Mar 11 13:22:39 crc kubenswrapper[4816]: I0311 13:22:39.515469 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:22:39 crc kubenswrapper[4816]: I0311 13:22:39.515972 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:22:50 crc kubenswrapper[4816]: I0311 13:22:50.666308 4816 scope.go:117] "RemoveContainer" containerID="e4b94bbef2f14a1e765d933fe579ccf92b49db99e68b93650802fa89e27f09ad" Mar 11 13:23:09 crc kubenswrapper[4816]: I0311 13:23:09.515009 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:23:09 crc kubenswrapper[4816]: I0311 13:23:09.515498 4816 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:23:32 crc kubenswrapper[4816]: I0311 13:23:32.636276 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-fjkn4_72237264-5d09-40bd-ba83-f30b76790cb6/manager/0.log" Mar 11 13:23:32 crc kubenswrapper[4816]: I0311 13:23:32.833928 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/util/0.log" Mar 11 13:23:32 crc kubenswrapper[4816]: I0311 13:23:32.990766 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/util/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.068491 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/pull/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.196714 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/pull/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.356301 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/util/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.378539 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/pull/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.659068 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vtksp_b89bbc79-4a51-434d-916c-bf02869be9cb/extract/0.log" Mar 11 13:23:33 crc kubenswrapper[4816]: I0311 13:23:33.914936 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-px2wm_c28c6622-633e-4e76-9c9a-eb732531fa1a/manager/0.log" Mar 11 13:23:34 crc 
kubenswrapper[4816]: I0311 13:23:34.014880 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-66ctj_b941b0f1-4a8f-4517-af46-cc77892fe3d9/manager/0.log" Mar 11 13:23:34 crc kubenswrapper[4816]: I0311 13:23:34.198792 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-8v46x_9e0c8832-9c20-44a9-933c-4a7fff032367/manager/0.log" Mar 11 13:23:34 crc kubenswrapper[4816]: I0311 13:23:34.705075 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-874hd_f37fb9b3-7b07-4188-b9ea-facfa5e945f0/manager/0.log" Mar 11 13:23:34 crc kubenswrapper[4816]: I0311 13:23:34.729704 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-hzd9q_a605e964-6e3c-4639-95d5-908f5d0ab7ef/manager/0.log" Mar 11 13:23:34 crc kubenswrapper[4816]: I0311 13:23:34.819385 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-g8cg2_6311ca5f-6f4c-4768-ae5e-75128be7f589/manager/0.log" Mar 11 13:23:35 crc kubenswrapper[4816]: I0311 13:23:35.264902 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-zczdq_73e00d02-6599-4cab-a32b-8fe96b82951a/manager/0.log" Mar 11 13:23:35 crc kubenswrapper[4816]: I0311 13:23:35.493131 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-bl9hm_bcfe1f90-2b5f-43b7-b798-0bad62ec53b2/manager/0.log" Mar 11 13:23:35 crc kubenswrapper[4816]: I0311 13:23:35.519521 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-wnsst_5d318732-8194-49eb-a2a3-c5b13ce843a7/manager/0.log" Mar 11 
13:23:35 crc kubenswrapper[4816]: I0311 13:23:35.838114 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-h2vmc_4d4c74ff-52a2-4426-bd06-daa6e9b1a832/manager/0.log" Mar 11 13:23:36 crc kubenswrapper[4816]: I0311 13:23:36.336700 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-56fsw_b16cacfc-8fc3-444d-a2d7-6ffeaf8362d5/manager/0.log" Mar 11 13:23:36 crc kubenswrapper[4816]: I0311 13:23:36.495580 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-rxhkb_d1702062-37ba-43c0-becb-005e11f457a0/manager/0.log" Mar 11 13:23:36 crc kubenswrapper[4816]: I0311 13:23:36.597132 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c969dbbcd-ks52l_78a7aebd-70a2-4608-a669-aea496cb6186/manager/0.log" Mar 11 13:23:36 crc kubenswrapper[4816]: I0311 13:23:36.903773 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b9994cf8-zz7rl_0347df32-1ff0-463e-b073-077df8f41595/operator/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.197541 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zsrdm_4ed28d20-6f1f-4bb8-853d-284003a6b922/registry-server/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.457563 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-rr62t_6bbceab2-fe2b-4693-867d-aa2a51261611/manager/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.549339 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-h7kgb_e04ad395-8120-4c57-8575-611fa438e8fb/manager/0.log" 
Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.643552 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dnqpf_8e810ef6-d3f5-4133-bce2-234df32b3d10/operator/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.845408 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-426qz_d7932403-615f-44e4-b195-4a83c19787ba/manager/0.log" Mar 11 13:23:37 crc kubenswrapper[4816]: I0311 13:23:37.890560 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7795b46f77-pt8n6_5f4b0b09-5704-432a-9cd4-82a296f3c467/manager/0.log" Mar 11 13:23:38 crc kubenswrapper[4816]: I0311 13:23:38.065278 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-7ldx8_0ddf91ff-6d91-4213-8032-05f80408063d/manager/0.log" Mar 11 13:23:38 crc kubenswrapper[4816]: I0311 13:23:38.073368 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-k2rnj_282f8f05-9a84-4bb4-a122-ba8806324ca3/manager/0.log" Mar 11 13:23:38 crc kubenswrapper[4816]: I0311 13:23:38.248287 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-kx9nz_4126be7d-7ca8-4e68-94d4-ea21644fbd85/manager/0.log" Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.514519 4816 patch_prober.go:28] interesting pod/machine-config-daemon-b4v82 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.514834 4816 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.514876 4816 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.515445 4816 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"} pod="openshift-machine-config-operator/machine-config-daemon-b4v82" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 13:23:39 crc kubenswrapper[4816]: I0311 13:23:39.515490 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" containerName="machine-config-daemon" containerID="cri-o://1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" gracePeriod=600 Mar 11 13:23:39 crc kubenswrapper[4816]: E0311 13:23:39.634999 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:23:40 crc kubenswrapper[4816]: I0311 13:23:40.276239 4816 generic.go:334] "Generic (PLEG): container finished" podID="7fdff21c-644f-4443-a268-f98c91ea120a" 
containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" exitCode=0 Mar 11 13:23:40 crc kubenswrapper[4816]: I0311 13:23:40.276281 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerDied","Data":"1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911"} Mar 11 13:23:40 crc kubenswrapper[4816]: I0311 13:23:40.276341 4816 scope.go:117] "RemoveContainer" containerID="ddd6136328dc7ec62752abe3735d43f3f986aeada7e2653f4b4a88d5e086c6c4" Mar 11 13:23:40 crc kubenswrapper[4816]: I0311 13:23:40.276862 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:23:40 crc kubenswrapper[4816]: E0311 13:23:40.277068 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:23:42 crc kubenswrapper[4816]: I0311 13:23:42.277659 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-rb228_a8133b64-eb11-43ad-bf6e-a278af0ff466/manager/0.log" Mar 11 13:23:55 crc kubenswrapper[4816]: I0311 13:23:55.131371 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:23:55 crc kubenswrapper[4816]: E0311 13:23:55.132111 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:23:59 crc kubenswrapper[4816]: I0311 13:23:59.487004 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ksjm4_db49f265-44d3-468b-8e2f-2246b02b57be/control-plane-machine-set-operator/0.log" Mar 11 13:23:59 crc kubenswrapper[4816]: I0311 13:23:59.576063 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t5t6b_cf7eaa86-2d32-4321-9016-e785320de3e2/kube-rbac-proxy/0.log" Mar 11 13:23:59 crc kubenswrapper[4816]: I0311 13:23:59.630348 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t5t6b_cf7eaa86-2d32-4321-9016-e785320de3e2/machine-api-operator/0.log" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.149471 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553924-rvhwv"] Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.150340 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.153040 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.153939 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.157220 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.170527 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553924-rvhwv"] Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.319277 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") pod \"auto-csr-approver-29553924-rvhwv\" (UID: \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\") " pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.420665 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") pod \"auto-csr-approver-29553924-rvhwv\" (UID: \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\") " pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.439458 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") pod \"auto-csr-approver-29553924-rvhwv\" (UID: \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\") " 
pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.473651 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:00 crc kubenswrapper[4816]: I0311 13:24:00.902594 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553924-rvhwv"] Mar 11 13:24:01 crc kubenswrapper[4816]: I0311 13:24:01.436013 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" event={"ID":"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314","Type":"ContainerStarted","Data":"ce200279374d5c25974cc9bda3280802f87c35f6ee351a2d31cadbe9da998a81"} Mar 11 13:24:03 crc kubenswrapper[4816]: I0311 13:24:03.458616 4816 generic.go:334] "Generic (PLEG): container finished" podID="c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" containerID="289319f8c74a3f6941e1372e90484c85b50d9f435ddf8b7c0a56ed2e3b71fb7c" exitCode=0 Mar 11 13:24:03 crc kubenswrapper[4816]: I0311 13:24:03.458818 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" event={"ID":"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314","Type":"ContainerDied","Data":"289319f8c74a3f6941e1372e90484c85b50d9f435ddf8b7c0a56ed2e3b71fb7c"} Mar 11 13:24:04 crc kubenswrapper[4816]: I0311 13:24:04.854107 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:04 crc kubenswrapper[4816]: I0311 13:24:04.998485 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") pod \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\" (UID: \"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314\") " Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.431190 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw" (OuterVolumeSpecName: "kube-api-access-lbrmw") pod "c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" (UID: "c5e898cc-ff3f-4b4e-8fd4-4a85d3934314"). InnerVolumeSpecName "kube-api-access-lbrmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.484889 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" event={"ID":"c5e898cc-ff3f-4b4e-8fd4-4a85d3934314","Type":"ContainerDied","Data":"ce200279374d5c25974cc9bda3280802f87c35f6ee351a2d31cadbe9da998a81"} Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.484947 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce200279374d5c25974cc9bda3280802f87c35f6ee351a2d31cadbe9da998a81" Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.485011 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553924-rvhwv" Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.514535 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbrmw\" (UniqueName: \"kubernetes.io/projected/c5e898cc-ff3f-4b4e-8fd4-4a85d3934314-kube-api-access-lbrmw\") on node \"crc\" DevicePath \"\"" Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.937961 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:24:05 crc kubenswrapper[4816]: I0311 13:24:05.944655 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553918-7rzs7"] Mar 11 13:24:06 crc kubenswrapper[4816]: I0311 13:24:06.142630 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95b7b2b-acc3-47bd-b762-29e39ca68f93" path="/var/lib/kubelet/pods/d95b7b2b-acc3-47bd-b762-29e39ca68f93/volumes" Mar 11 13:24:10 crc kubenswrapper[4816]: I0311 13:24:10.131059 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:24:10 crc kubenswrapper[4816]: E0311 13:24:10.131652 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:24:14 crc kubenswrapper[4816]: I0311 13:24:14.708899 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-62cp5_e50b3f6b-4679-4337-a9cf-478aa2fb5800/cert-manager-controller/0.log" Mar 11 13:24:14 crc kubenswrapper[4816]: I0311 13:24:14.905105 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-fgzw7_f7a5fee8-e8c0-47ae-b730-cf5c1d7133c8/cert-manager-cainjector/0.log" Mar 11 13:24:14 crc kubenswrapper[4816]: I0311 13:24:14.909805 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-2jk7k_0a41e6b9-3b80-4eed-a8db-65aa010f449d/cert-manager-webhook/0.log" Mar 11 13:24:22 crc kubenswrapper[4816]: I0311 13:24:22.131381 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:24:22 crc kubenswrapper[4816]: E0311 13:24:22.132148 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.075288 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-px2gk_a822f6ee-e723-4f64-b4f6-c948dc948359/nmstate-console-plugin/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.252900 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-47rs2_fe3fb536-d8aa-4415-b66e-3fd6dc2ecba9/nmstate-handler/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.304637 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-2snpd_7fb0dcd0-9411-49d6-a997-79d2099b2462/kube-rbac-proxy/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.347771 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-2snpd_7fb0dcd0-9411-49d6-a997-79d2099b2462/nmstate-metrics/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.943478 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-g59xq_c1f09ebe-c0e1-415c-9ea9-42fc42240e94/nmstate-operator/0.log" Mar 11 13:24:29 crc kubenswrapper[4816]: I0311 13:24:29.992016 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-xq48v_1b664fad-a0fa-4442-bed2-3316eafbb78c/nmstate-webhook/0.log" Mar 11 13:24:37 crc kubenswrapper[4816]: I0311 13:24:37.129904 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:24:37 crc kubenswrapper[4816]: E0311 13:24:37.130578 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:24:50 crc kubenswrapper[4816]: I0311 13:24:50.130826 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:24:50 crc kubenswrapper[4816]: E0311 13:24:50.131800 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:24:50 crc kubenswrapper[4816]: I0311 
13:24:50.771797 4816 scope.go:117] "RemoveContainer" containerID="740acfe6fc04d23ba8749fd0de9541e5bd0ee02db427a2bd65a7b93925e05ec4" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.365798 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-srnjf_2af0656a-169d-42fe-8efb-5258bc56af56/kube-rbac-proxy/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.586699 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-frr-files/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.760556 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-srnjf_2af0656a-169d-42fe-8efb-5258bc56af56/controller/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.802488 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-frr-files/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.817843 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-reloader/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.832672 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-metrics/0.log" Mar 11 13:25:00 crc kubenswrapper[4816]: I0311 13:25:00.964260 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-reloader/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.116911 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-reloader/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.117100 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-frr-files/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.164395 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-metrics/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.176607 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-metrics/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.328718 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-frr-files/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.346831 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-metrics/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.367787 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/cp-reloader/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.395217 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/controller/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.537749 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/frr-metrics/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.576956 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/kube-rbac-proxy/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.596264 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/kube-rbac-proxy-frr/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.707378 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/reloader/0.log" Mar 11 13:25:01 crc kubenswrapper[4816]: I0311 13:25:01.853380 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-h8scg_6512814f-09cf-4b97-a1d6-ec99bcbf1525/frr-k8s-webhook-server/0.log" Mar 11 13:25:02 crc kubenswrapper[4816]: I0311 13:25:02.030546 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5679b59769-8stwg_7f7c9c4d-3a3f-4524-8964-8a99f24c2786/manager/0.log" Mar 11 13:25:02 crc kubenswrapper[4816]: I0311 13:25:02.127598 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-96bb59846-7z5mz_72342d10-d8c0-4f04-9554-e57c84d77653/webhook-server/0.log" Mar 11 13:25:02 crc kubenswrapper[4816]: I0311 13:25:02.130382 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:02 crc kubenswrapper[4816]: E0311 13:25:02.130629 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:02 crc kubenswrapper[4816]: I0311 13:25:02.271074 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wqwrt_43ec0f0d-8425-4dc4-9aa2-f1f85a26548c/kube-rbac-proxy/0.log" Mar 11 13:25:02 crc kubenswrapper[4816]: 
I0311 13:25:02.712101 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wqwrt_43ec0f0d-8425-4dc4-9aa2-f1f85a26548c/speaker/0.log" Mar 11 13:25:03 crc kubenswrapper[4816]: I0311 13:25:03.004554 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjfwg_00616041-f382-4b2a-a7ef-b75a14621ce1/frr/0.log" Mar 11 13:25:13 crc kubenswrapper[4816]: I0311 13:25:13.129919 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:13 crc kubenswrapper[4816]: E0311 13:25:13.130682 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.319340 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/util/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.515059 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/pull/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.547105 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/util/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.564211 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/pull/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.724212 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/extract/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.725034 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/pull/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.743216 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g6wjp_2a03942f-8b0e-4041-8843-ad5e6cedc6b0/util/0.log" Mar 11 13:25:17 crc kubenswrapper[4816]: I0311 13:25:17.871460 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.028615 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.032241 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/pull/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.055451 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/pull/0.log" Mar 11 
13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.521880 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/pull/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.570181 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/extract/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.591340 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bm7w8_5dff60f3-3acf-4dfb-9098-917736f61c0c/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.710462 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.886457 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/pull/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.939094 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/util/0.log" Mar 11 13:25:18 crc kubenswrapper[4816]: I0311 13:25:18.957148 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/pull/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.115030 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/util/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.124105 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/pull/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.137676 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5sxtp2_f9c6bbc7-62af-4c3a-ac05-1897b9f00080/extract/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.273435 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-utilities/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.469357 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-content/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.493023 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-utilities/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.503870 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-content/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.938641 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-content/0.log" Mar 11 13:25:19 crc kubenswrapper[4816]: I0311 13:25:19.993011 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/extract-utilities/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.161613 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-utilities/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.323108 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-82vz9_2b81c3bf-499d-48bd-869b-671fefa1ba81/registry-server/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.353174 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-content/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.395498 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-content/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.397609 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-utilities/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.554960 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-utilities/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.643484 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/extract-content/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.744729 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2l2xd_a13e0873-9c6c-46d7-b0bf-4ef50c40a918/registry-server/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.782780 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m586v_e86ee6f4-c5ee-40dd-8e60-977add936dc1/marketplace-operator/0.log" Mar 11 13:25:20 crc kubenswrapper[4816]: I0311 13:25:20.854553 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.043029 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.052018 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.068528 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.368813 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.383870 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.425623 4816 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.479752 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9jq_991327ed-0ad5-4161-a218-598e50bbafe9/registry-server/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.589871 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.590872 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-utilities/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.614136 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.764524 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-content/0.log" Mar 11 13:25:21 crc kubenswrapper[4816]: I0311 13:25:21.784543 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/extract-utilities/0.log" Mar 11 13:25:22 crc kubenswrapper[4816]: I0311 13:25:22.420230 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4czr8_08bf2596-9393-42d3-9b76-461be3ee0c22/registry-server/0.log" Mar 11 13:25:27 crc kubenswrapper[4816]: I0311 13:25:27.130327 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:27 crc kubenswrapper[4816]: E0311 13:25:27.130873 
4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:38 crc kubenswrapper[4816]: I0311 13:25:38.131159 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:38 crc kubenswrapper[4816]: E0311 13:25:38.132076 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:50 crc kubenswrapper[4816]: I0311 13:25:50.131557 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:25:50 crc kubenswrapper[4816]: E0311 13:25:50.132558 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:25:50 crc kubenswrapper[4816]: I0311 13:25:50.853753 4816 scope.go:117] "RemoveContainer" containerID="3c47893cabbfc635edaea2ea48266ffc815a61e1e094878326c38fe6119ee1b9" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 
13:26:00.159019 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553926-z2j68"] Mar 11 13:26:00 crc kubenswrapper[4816]: E0311 13:26:00.160093 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" containerName="oc" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.160115 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" containerName="oc" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.160785 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e898cc-ff3f-4b4e-8fd4-4a85d3934314" containerName="oc" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.161486 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.165350 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.165564 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.165970 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.183753 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") pod \"auto-csr-approver-29553926-z2j68\" (UID: \"bb680b80-e315-429b-abf6-ff316b5086d2\") " pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.184042 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29553926-z2j68"] Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.285348 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") pod \"auto-csr-approver-29553926-z2j68\" (UID: \"bb680b80-e315-429b-abf6-ff316b5086d2\") " pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.309598 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") pod \"auto-csr-approver-29553926-z2j68\" (UID: \"bb680b80-e315-429b-abf6-ff316b5086d2\") " pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.499636 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:00 crc kubenswrapper[4816]: I0311 13:26:00.788077 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553926-z2j68"] Mar 11 13:26:01 crc kubenswrapper[4816]: I0311 13:26:01.415529 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553926-z2j68" event={"ID":"bb680b80-e315-429b-abf6-ff316b5086d2","Type":"ContainerStarted","Data":"65541b39ae0514d6122a7c1a032d1898b30e5c799b8a8100a8cab098ffcc87ce"} Mar 11 13:26:03 crc kubenswrapper[4816]: I0311 13:26:03.436074 4816 generic.go:334] "Generic (PLEG): container finished" podID="bb680b80-e315-429b-abf6-ff316b5086d2" containerID="2b01fcc246d768dc9f3a808039fe09e1a0fa481d4f86fec7e3628f8562f3e719" exitCode=0 Mar 11 13:26:03 crc kubenswrapper[4816]: I0311 13:26:03.436267 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29553926-z2j68" event={"ID":"bb680b80-e315-429b-abf6-ff316b5086d2","Type":"ContainerDied","Data":"2b01fcc246d768dc9f3a808039fe09e1a0fa481d4f86fec7e3628f8562f3e719"} Mar 11 13:26:04 crc kubenswrapper[4816]: I0311 13:26:04.807672 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:04 crc kubenswrapper[4816]: I0311 13:26:04.857760 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") pod \"bb680b80-e315-429b-abf6-ff316b5086d2\" (UID: \"bb680b80-e315-429b-abf6-ff316b5086d2\") " Mar 11 13:26:04 crc kubenswrapper[4816]: I0311 13:26:04.863220 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t" (OuterVolumeSpecName: "kube-api-access-gn46t") pod "bb680b80-e315-429b-abf6-ff316b5086d2" (UID: "bb680b80-e315-429b-abf6-ff316b5086d2"). InnerVolumeSpecName "kube-api-access-gn46t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:26:04 crc kubenswrapper[4816]: I0311 13:26:04.959944 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn46t\" (UniqueName: \"kubernetes.io/projected/bb680b80-e315-429b-abf6-ff316b5086d2-kube-api-access-gn46t\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.131497 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:05 crc kubenswrapper[4816]: E0311 13:26:05.131886 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.455381 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553926-z2j68" event={"ID":"bb680b80-e315-429b-abf6-ff316b5086d2","Type":"ContainerDied","Data":"65541b39ae0514d6122a7c1a032d1898b30e5c799b8a8100a8cab098ffcc87ce"} Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.455433 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553926-z2j68" Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.455444 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65541b39ae0514d6122a7c1a032d1898b30e5c799b8a8100a8cab098ffcc87ce" Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.884003 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:26:05 crc kubenswrapper[4816]: I0311 13:26:05.895121 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553920-h6hh5"] Mar 11 13:26:06 crc kubenswrapper[4816]: I0311 13:26:06.148244 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="193e1468-2f5b-4e66-94f3-a7fc184c7e01" path="/var/lib/kubelet/pods/193e1468-2f5b-4e66-94f3-a7fc184c7e01/volumes" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.774669 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:09 crc kubenswrapper[4816]: E0311 13:26:09.775564 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb680b80-e315-429b-abf6-ff316b5086d2" containerName="oc" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.775592 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb680b80-e315-429b-abf6-ff316b5086d2" containerName="oc" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.775915 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb680b80-e315-429b-abf6-ff316b5086d2" containerName="oc" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.777944 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.809665 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.944063 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.944141 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:09 crc kubenswrapper[4816]: I0311 13:26:09.944233 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045208 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045324 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045349 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045853 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.045907 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.084365 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") pod \"certified-operators-h5lmg\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.111055 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:10 crc kubenswrapper[4816]: I0311 13:26:10.660358 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:11 crc kubenswrapper[4816]: I0311 13:26:11.507481 4816 generic.go:334] "Generic (PLEG): container finished" podID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerID="d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006" exitCode=0 Mar 11 13:26:11 crc kubenswrapper[4816]: I0311 13:26:11.507631 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerDied","Data":"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006"} Mar 11 13:26:11 crc kubenswrapper[4816]: I0311 13:26:11.507897 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerStarted","Data":"88705377053d43154d3881339c36623191d31a836df133a4f330805ec483a271"} Mar 11 13:26:12 crc kubenswrapper[4816]: I0311 13:26:12.523546 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerStarted","Data":"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378"} Mar 11 13:26:13 crc kubenswrapper[4816]: I0311 13:26:13.538657 4816 generic.go:334] "Generic (PLEG): container finished" podID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerID="4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378" exitCode=0 Mar 11 13:26:13 crc kubenswrapper[4816]: I0311 13:26:13.538714 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" 
event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerDied","Data":"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378"} Mar 11 13:26:15 crc kubenswrapper[4816]: I0311 13:26:15.566937 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerStarted","Data":"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd"} Mar 11 13:26:15 crc kubenswrapper[4816]: I0311 13:26:15.592101 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h5lmg" podStartSLOduration=3.562510782 podStartE2EDuration="6.592085642s" podCreationTimestamp="2026-03-11 13:26:09 +0000 UTC" firstStartedPulling="2026-03-11 13:26:11.512345727 +0000 UTC m=+5258.103609724" lastFinishedPulling="2026-03-11 13:26:14.541920607 +0000 UTC m=+5261.133184584" observedRunningTime="2026-03-11 13:26:15.587636507 +0000 UTC m=+5262.178900474" watchObservedRunningTime="2026-03-11 13:26:15.592085642 +0000 UTC m=+5262.183349609" Mar 11 13:26:17 crc kubenswrapper[4816]: I0311 13:26:17.131712 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:17 crc kubenswrapper[4816]: E0311 13:26:17.132576 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:20 crc kubenswrapper[4816]: I0311 13:26:20.111218 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:20 crc 
kubenswrapper[4816]: I0311 13:26:20.111669 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:20 crc kubenswrapper[4816]: I0311 13:26:20.191460 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:20 crc kubenswrapper[4816]: I0311 13:26:20.668137 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:20 crc kubenswrapper[4816]: I0311 13:26:20.743696 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:22 crc kubenswrapper[4816]: I0311 13:26:22.653987 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h5lmg" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="registry-server" containerID="cri-o://1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" gracePeriod=2 Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.097831 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.235123 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") pod \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.235327 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") pod \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.235385 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") pod \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\" (UID: \"f5ab741b-37be-41a8-ac90-39c44e1c3cce\") " Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.236306 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities" (OuterVolumeSpecName: "utilities") pod "f5ab741b-37be-41a8-ac90-39c44e1c3cce" (UID: "f5ab741b-37be-41a8-ac90-39c44e1c3cce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.241479 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8" (OuterVolumeSpecName: "kube-api-access-ksqf8") pod "f5ab741b-37be-41a8-ac90-39c44e1c3cce" (UID: "f5ab741b-37be-41a8-ac90-39c44e1c3cce"). InnerVolumeSpecName "kube-api-access-ksqf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.337238 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.337629 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksqf8\" (UniqueName: \"kubernetes.io/projected/f5ab741b-37be-41a8-ac90-39c44e1c3cce-kube-api-access-ksqf8\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.658606 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5ab741b-37be-41a8-ac90-39c44e1c3cce" (UID: "f5ab741b-37be-41a8-ac90-39c44e1c3cce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.667431 4816 generic.go:334] "Generic (PLEG): container finished" podID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerID="1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" exitCode=0 Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.667515 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h5lmg" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.667526 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerDied","Data":"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd"} Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.667969 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5lmg" event={"ID":"f5ab741b-37be-41a8-ac90-39c44e1c3cce","Type":"ContainerDied","Data":"88705377053d43154d3881339c36623191d31a836df133a4f330805ec483a271"} Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.668015 4816 scope.go:117] "RemoveContainer" containerID="1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.699791 4816 scope.go:117] "RemoveContainer" containerID="4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.731871 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.742305 4816 scope.go:117] "RemoveContainer" containerID="d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.742929 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5ab741b-37be-41a8-ac90-39c44e1c3cce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.749458 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h5lmg"] Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.774316 4816 scope.go:117] "RemoveContainer" 
containerID="1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" Mar 11 13:26:23 crc kubenswrapper[4816]: E0311 13:26:23.775929 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd\": container with ID starting with 1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd not found: ID does not exist" containerID="1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.776016 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd"} err="failed to get container status \"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd\": rpc error: code = NotFound desc = could not find container \"1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd\": container with ID starting with 1c78198eaa5ef01dd82d28c6d1f5ea83a3d3649257a14cd5bfe7c73ebd2c7efd not found: ID does not exist" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.776062 4816 scope.go:117] "RemoveContainer" containerID="4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378" Mar 11 13:26:23 crc kubenswrapper[4816]: E0311 13:26:23.776659 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378\": container with ID starting with 4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378 not found: ID does not exist" containerID="4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.776720 4816 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378"} err="failed to get container status \"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378\": rpc error: code = NotFound desc = could not find container \"4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378\": container with ID starting with 4ee85a7d424d8c2112523f9f25b5234e252e3cb4c06332ee42ac3e00fbed9378 not found: ID does not exist" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.776771 4816 scope.go:117] "RemoveContainer" containerID="d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006" Mar 11 13:26:23 crc kubenswrapper[4816]: E0311 13:26:23.777308 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006\": container with ID starting with d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006 not found: ID does not exist" containerID="d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006" Mar 11 13:26:23 crc kubenswrapper[4816]: I0311 13:26:23.777409 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006"} err="failed to get container status \"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006\": rpc error: code = NotFound desc = could not find container \"d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006\": container with ID starting with d7ecaa0f879577606667ec6d103914507b4b06cfec60b28fc41f33027ad79006 not found: ID does not exist" Mar 11 13:26:24 crc kubenswrapper[4816]: I0311 13:26:24.149780 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" path="/var/lib/kubelet/pods/f5ab741b-37be-41a8-ac90-39c44e1c3cce/volumes" Mar 11 13:26:30 crc kubenswrapper[4816]: I0311 
13:26:30.130773 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:30 crc kubenswrapper[4816]: E0311 13:26:30.131424 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:36 crc kubenswrapper[4816]: I0311 13:26:36.702908 4816 generic.go:334] "Generic (PLEG): container finished" podID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerID="275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996" exitCode=0 Mar 11 13:26:36 crc kubenswrapper[4816]: I0311 13:26:36.703079 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" event={"ID":"240c7704-66e9-4d5b-9b4f-cf8a80365c26","Type":"ContainerDied","Data":"275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996"} Mar 11 13:26:36 crc kubenswrapper[4816]: I0311 13:26:36.704282 4816 scope.go:117] "RemoveContainer" containerID="275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996" Mar 11 13:26:37 crc kubenswrapper[4816]: I0311 13:26:37.421077 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbcl7_must-gather-k8xh5_240c7704-66e9-4d5b-9b4f-cf8a80365c26/gather/0.log" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.265108 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:26:39 crc kubenswrapper[4816]: E0311 13:26:39.266343 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="registry-server" Mar 11 13:26:39 crc 
kubenswrapper[4816]: I0311 13:26:39.266360 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="registry-server" Mar 11 13:26:39 crc kubenswrapper[4816]: E0311 13:26:39.266372 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="extract-content" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.266378 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="extract-content" Mar 11 13:26:39 crc kubenswrapper[4816]: E0311 13:26:39.266406 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="extract-utilities" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.266413 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="extract-utilities" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.266575 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ab741b-37be-41a8-ac90-39c44e1c3cce" containerName="registry-server" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.267872 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.279586 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.425375 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.425489 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.425803 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527217 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527309 4816 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527377 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527794 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.527875 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.563054 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") pod \"redhat-operators-jq5z6\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.592660 4816 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:39 crc kubenswrapper[4816]: I0311 13:26:39.869859 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:26:40 crc kubenswrapper[4816]: I0311 13:26:40.742433 4816 generic.go:334] "Generic (PLEG): container finished" podID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerID="5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92" exitCode=0 Mar 11 13:26:40 crc kubenswrapper[4816]: I0311 13:26:40.742545 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerDied","Data":"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92"} Mar 11 13:26:40 crc kubenswrapper[4816]: I0311 13:26:40.742677 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerStarted","Data":"b5442a18316c3fb9c79391159649faad4dab8cdd84c8cce704995af04a204fca"} Mar 11 13:26:41 crc kubenswrapper[4816]: I0311 13:26:41.754872 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerStarted","Data":"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"} Mar 11 13:26:42 crc kubenswrapper[4816]: I0311 13:26:42.768296 4816 generic.go:334] "Generic (PLEG): container finished" podID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerID="22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573" exitCode=0 Mar 11 13:26:42 crc kubenswrapper[4816]: I0311 13:26:42.768593 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" 
event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerDied","Data":"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"} Mar 11 13:26:43 crc kubenswrapper[4816]: I0311 13:26:43.785506 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerStarted","Data":"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"} Mar 11 13:26:43 crc kubenswrapper[4816]: I0311 13:26:43.815988 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jq5z6" podStartSLOduration=2.306411344 podStartE2EDuration="4.815965128s" podCreationTimestamp="2026-03-11 13:26:39 +0000 UTC" firstStartedPulling="2026-03-11 13:26:40.74404604 +0000 UTC m=+5287.335310007" lastFinishedPulling="2026-03-11 13:26:43.253599814 +0000 UTC m=+5289.844863791" observedRunningTime="2026-03-11 13:26:43.810951907 +0000 UTC m=+5290.402215944" watchObservedRunningTime="2026-03-11 13:26:43.815965128 +0000 UTC m=+5290.407229115" Mar 11 13:26:44 crc kubenswrapper[4816]: I0311 13:26:44.135654 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:44 crc kubenswrapper[4816]: E0311 13:26:44.136102 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.526617 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 
13:26:45.527604 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="copy" containerID="cri-o://54930770a8edcb6e0930ad6af1934aac20a12f0e3525f98e64859acac70909c5" gracePeriod=2 Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.535315 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rbcl7/must-gather-k8xh5"] Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.813861 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbcl7_must-gather-k8xh5_240c7704-66e9-4d5b-9b4f-cf8a80365c26/copy/0.log" Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.814549 4816 generic.go:334] "Generic (PLEG): container finished" podID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerID="54930770a8edcb6e0930ad6af1934aac20a12f0e3525f98e64859acac70909c5" exitCode=143 Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.972853 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbcl7_must-gather-k8xh5_240c7704-66e9-4d5b-9b4f-cf8a80365c26/copy/0.log" Mar 11 13:26:45 crc kubenswrapper[4816]: I0311 13:26:45.973390 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.122318 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") pod \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.122404 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") pod \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\" (UID: \"240c7704-66e9-4d5b-9b4f-cf8a80365c26\") " Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.143217 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7" (OuterVolumeSpecName: "kube-api-access-g4wc7") pod "240c7704-66e9-4d5b-9b4f-cf8a80365c26" (UID: "240c7704-66e9-4d5b-9b4f-cf8a80365c26"). InnerVolumeSpecName "kube-api-access-g4wc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.224620 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4wc7\" (UniqueName: \"kubernetes.io/projected/240c7704-66e9-4d5b-9b4f-cf8a80365c26-kube-api-access-g4wc7\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.231158 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "240c7704-66e9-4d5b-9b4f-cf8a80365c26" (UID: "240c7704-66e9-4d5b-9b4f-cf8a80365c26"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.326076 4816 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/240c7704-66e9-4d5b-9b4f-cf8a80365c26-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.825055 4816 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rbcl7_must-gather-k8xh5_240c7704-66e9-4d5b-9b4f-cf8a80365c26/copy/0.log" Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.826418 4816 scope.go:117] "RemoveContainer" containerID="54930770a8edcb6e0930ad6af1934aac20a12f0e3525f98e64859acac70909c5" Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.826553 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rbcl7/must-gather-k8xh5" Mar 11 13:26:46 crc kubenswrapper[4816]: I0311 13:26:46.843954 4816 scope.go:117] "RemoveContainer" containerID="275f5d5dbbc9a09455fcf3424925e09fcf25e8e9a31a9d4c2991fb94c2921996" Mar 11 13:26:48 crc kubenswrapper[4816]: I0311 13:26:48.141001 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" path="/var/lib/kubelet/pods/240c7704-66e9-4d5b-9b4f-cf8a80365c26/volumes" Mar 11 13:26:49 crc kubenswrapper[4816]: I0311 13:26:49.592935 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:49 crc kubenswrapper[4816]: I0311 13:26:49.593491 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:50 crc kubenswrapper[4816]: I0311 13:26:50.640771 4816 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jq5z6" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server" 
probeResult="failure" output=< Mar 11 13:26:50 crc kubenswrapper[4816]: timeout: failed to connect service ":50051" within 1s Mar 11 13:26:50 crc kubenswrapper[4816]: > Mar 11 13:26:51 crc kubenswrapper[4816]: I0311 13:26:51.319384 4816 scope.go:117] "RemoveContainer" containerID="4a3cade9d3e8a7bb5a9e71032c96de36c1178b4f7d16ddbd543510f67b6be155" Mar 11 13:26:55 crc kubenswrapper[4816]: I0311 13:26:55.130799 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:26:55 crc kubenswrapper[4816]: E0311 13:26:55.131704 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:26:59 crc kubenswrapper[4816]: I0311 13:26:59.656580 4816 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:59 crc kubenswrapper[4816]: I0311 13:26:59.731376 4816 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:26:59 crc kubenswrapper[4816]: I0311 13:26:59.900100 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:27:00 crc kubenswrapper[4816]: I0311 13:27:00.954200 4816 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jq5z6" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server" containerID="cri-o://5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3" gracePeriod=2 Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.385288 
4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.560437 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") pod \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.560527 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") pod \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.560661 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") pod \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\" (UID: \"34d81fa0-710a-4fdd-b98b-bd88b80a7343\") " Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.561959 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities" (OuterVolumeSpecName: "utilities") pod "34d81fa0-710a-4fdd-b98b-bd88b80a7343" (UID: "34d81fa0-710a-4fdd-b98b-bd88b80a7343"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.569157 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l" (OuterVolumeSpecName: "kube-api-access-bfv5l") pod "34d81fa0-710a-4fdd-b98b-bd88b80a7343" (UID: "34d81fa0-710a-4fdd-b98b-bd88b80a7343"). 
InnerVolumeSpecName "kube-api-access-bfv5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.662097 4816 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.662141 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfv5l\" (UniqueName: \"kubernetes.io/projected/34d81fa0-710a-4fdd-b98b-bd88b80a7343-kube-api-access-bfv5l\") on node \"crc\" DevicePath \"\"" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.727122 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34d81fa0-710a-4fdd-b98b-bd88b80a7343" (UID: "34d81fa0-710a-4fdd-b98b-bd88b80a7343"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.763754 4816 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d81fa0-710a-4fdd-b98b-bd88b80a7343-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966669 4816 generic.go:334] "Generic (PLEG): container finished" podID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerID="5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3" exitCode=0 Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966737 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerDied","Data":"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"} Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966782 4816 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jq5z6" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966806 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jq5z6" event={"ID":"34d81fa0-710a-4fdd-b98b-bd88b80a7343","Type":"ContainerDied","Data":"b5442a18316c3fb9c79391159649faad4dab8cdd84c8cce704995af04a204fca"} Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.966830 4816 scope.go:117] "RemoveContainer" containerID="5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3" Mar 11 13:27:01 crc kubenswrapper[4816]: I0311 13:27:01.986659 4816 scope.go:117] "RemoveContainer" containerID="22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573" Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.018901 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.021482 4816 scope.go:117] "RemoveContainer" containerID="5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92" Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.026765 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jq5z6"] Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.042610 4816 scope.go:117] "RemoveContainer" containerID="5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3" Mar 11 13:27:02 crc kubenswrapper[4816]: E0311 13:27:02.042950 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3\": container with ID starting with 5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3 not found: ID does not exist" containerID="5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3" Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.042980 4816 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3"} err="failed to get container status \"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3\": rpc error: code = NotFound desc = could not find container \"5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3\": container with ID starting with 5fe5120c5eb97d31c12a15c7394050040b0f13cf69682c7d15d1dfaccbafc5a3 not found: ID does not exist" Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.043001 4816 scope.go:117] "RemoveContainer" containerID="22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573" Mar 11 13:27:02 crc kubenswrapper[4816]: E0311 13:27:02.043366 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573\": container with ID starting with 22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573 not found: ID does not exist" containerID="22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573" Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.043388 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573"} err="failed to get container status \"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573\": rpc error: code = NotFound desc = could not find container \"22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573\": container with ID starting with 22ca4a09876d0901416bc035c5e08aae69a625ba3ccffe89562409bcb4433573 not found: ID does not exist" Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.043401 4816 scope.go:117] "RemoveContainer" containerID="5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92" Mar 11 13:27:02 crc kubenswrapper[4816]: E0311 
13:27:02.043778 4816 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92\": container with ID starting with 5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92 not found: ID does not exist" containerID="5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92" Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.043799 4816 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92"} err="failed to get container status \"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92\": rpc error: code = NotFound desc = could not find container \"5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92\": container with ID starting with 5fa6520a596bfb3b937463bb486971345dd932a0c909ffeb49e0c2fe56ac3e92 not found: ID does not exist" Mar 11 13:27:02 crc kubenswrapper[4816]: I0311 13:27:02.141421 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" path="/var/lib/kubelet/pods/34d81fa0-710a-4fdd-b98b-bd88b80a7343/volumes" Mar 11 13:27:10 crc kubenswrapper[4816]: I0311 13:27:10.135418 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:27:10 crc kubenswrapper[4816]: E0311 13:27:10.136375 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:27:23 crc kubenswrapper[4816]: I0311 13:27:23.130946 
4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:27:23 crc kubenswrapper[4816]: E0311 13:27:23.131852 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:27:36 crc kubenswrapper[4816]: I0311 13:27:36.130433 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:27:36 crc kubenswrapper[4816]: E0311 13:27:36.131338 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:27:49 crc kubenswrapper[4816]: I0311 13:27:49.130379 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:27:49 crc kubenswrapper[4816]: E0311 13:27:49.131296 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 
13:28:00.156854 4816 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553928-j48rh"] Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157628 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="extract-content" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157640 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="extract-content" Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157651 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="copy" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157657 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="copy" Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157670 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157676 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server" Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157685 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="gather" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157691 4816 state_mem.go:107] "Deleted CPUSet assignment" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="gather" Mar 11 13:28:00 crc kubenswrapper[4816]: E0311 13:28:00.157702 4816 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="extract-utilities" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157707 4816 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="extract-utilities" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157863 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="copy" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157879 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="240c7704-66e9-4d5b-9b4f-cf8a80365c26" containerName="gather" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.157890 4816 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d81fa0-710a-4fdd-b98b-bd88b80a7343" containerName="registry-server" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.158297 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553928-j48rh" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.161084 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.161301 4816 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-58r5h" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.161369 4816 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.183079 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553928-j48rh"] Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.294440 4816 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") pod \"auto-csr-approver-29553928-j48rh\" (UID: \"31fec07e-a834-4a80-9534-cfa4b1939ffc\") " 
pod="openshift-infra/auto-csr-approver-29553928-j48rh" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.395762 4816 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") pod \"auto-csr-approver-29553928-j48rh\" (UID: \"31fec07e-a834-4a80-9534-cfa4b1939ffc\") " pod="openshift-infra/auto-csr-approver-29553928-j48rh" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.429286 4816 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") pod \"auto-csr-approver-29553928-j48rh\" (UID: \"31fec07e-a834-4a80-9534-cfa4b1939ffc\") " pod="openshift-infra/auto-csr-approver-29553928-j48rh" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.486872 4816 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553928-j48rh" Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.816078 4816 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553928-j48rh"] Mar 11 13:28:00 crc kubenswrapper[4816]: I0311 13:28:00.824171 4816 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 13:28:01 crc kubenswrapper[4816]: I0311 13:28:01.534295 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553928-j48rh" event={"ID":"31fec07e-a834-4a80-9534-cfa4b1939ffc","Type":"ContainerStarted","Data":"e22f86625762b25fb3daad30bead92f87f9412d99c562ef147e5e5a37a8d3809"} Mar 11 13:28:02 crc kubenswrapper[4816]: I0311 13:28:02.541229 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553928-j48rh" 
event={"ID":"31fec07e-a834-4a80-9534-cfa4b1939ffc","Type":"ContainerStarted","Data":"2d431be4a15d84bda7a012602744dbc22889b621168f3e90771a1c976b8807e5"} Mar 11 13:28:02 crc kubenswrapper[4816]: I0311 13:28:02.556774 4816 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553928-j48rh" podStartSLOduration=1.299542255 podStartE2EDuration="2.556756155s" podCreationTimestamp="2026-03-11 13:28:00 +0000 UTC" firstStartedPulling="2026-03-11 13:28:00.823795273 +0000 UTC m=+5367.415059230" lastFinishedPulling="2026-03-11 13:28:02.081009133 +0000 UTC m=+5368.672273130" observedRunningTime="2026-03-11 13:28:02.554692817 +0000 UTC m=+5369.145956784" watchObservedRunningTime="2026-03-11 13:28:02.556756155 +0000 UTC m=+5369.148020122" Mar 11 13:28:03 crc kubenswrapper[4816]: I0311 13:28:03.551841 4816 generic.go:334] "Generic (PLEG): container finished" podID="31fec07e-a834-4a80-9534-cfa4b1939ffc" containerID="2d431be4a15d84bda7a012602744dbc22889b621168f3e90771a1c976b8807e5" exitCode=0 Mar 11 13:28:03 crc kubenswrapper[4816]: I0311 13:28:03.551910 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553928-j48rh" event={"ID":"31fec07e-a834-4a80-9534-cfa4b1939ffc","Type":"ContainerDied","Data":"2d431be4a15d84bda7a012602744dbc22889b621168f3e90771a1c976b8807e5"} Mar 11 13:28:04 crc kubenswrapper[4816]: I0311 13:28:04.139711 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:28:04 crc kubenswrapper[4816]: E0311 13:28:04.140415 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" 
podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:28:04 crc kubenswrapper[4816]: I0311 13:28:04.948814 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553928-j48rh" Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.071733 4816 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") pod \"31fec07e-a834-4a80-9534-cfa4b1939ffc\" (UID: \"31fec07e-a834-4a80-9534-cfa4b1939ffc\") " Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.081140 4816 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6" (OuterVolumeSpecName: "kube-api-access-rdhc6") pod "31fec07e-a834-4a80-9534-cfa4b1939ffc" (UID: "31fec07e-a834-4a80-9534-cfa4b1939ffc"). InnerVolumeSpecName "kube-api-access-rdhc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.174155 4816 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdhc6\" (UniqueName: \"kubernetes.io/projected/31fec07e-a834-4a80-9534-cfa4b1939ffc-kube-api-access-rdhc6\") on node \"crc\" DevicePath \"\"" Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.574161 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553928-j48rh" event={"ID":"31fec07e-a834-4a80-9534-cfa4b1939ffc","Type":"ContainerDied","Data":"e22f86625762b25fb3daad30bead92f87f9412d99c562ef147e5e5a37a8d3809"} Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.574230 4816 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e22f86625762b25fb3daad30bead92f87f9412d99c562ef147e5e5a37a8d3809" Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.574336 4816 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553928-j48rh" Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.659706 4816 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"] Mar 11 13:28:05 crc kubenswrapper[4816]: I0311 13:28:05.666233 4816 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553922-l2chb"] Mar 11 13:28:06 crc kubenswrapper[4816]: I0311 13:28:06.145568 4816 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09c6fad-26f7-4ea2-84fc-5d2efb86fd02" path="/var/lib/kubelet/pods/a09c6fad-26f7-4ea2-84fc-5d2efb86fd02/volumes" Mar 11 13:28:18 crc kubenswrapper[4816]: I0311 13:28:18.131000 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:28:18 crc kubenswrapper[4816]: E0311 13:28:18.131568 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:28:33 crc kubenswrapper[4816]: I0311 13:28:33.130613 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:28:33 crc kubenswrapper[4816]: E0311 13:28:33.131762 4816 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b4v82_openshift-machine-config-operator(7fdff21c-644f-4443-a268-f98c91ea120a)\"" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" podUID="7fdff21c-644f-4443-a268-f98c91ea120a" Mar 11 13:28:48 crc kubenswrapper[4816]: I0311 13:28:48.130920 4816 scope.go:117] "RemoveContainer" containerID="1217d235a3bfa975a8546784eae4eeacb0575927046672aa572fbaac0320a911" Mar 11 13:28:48 crc kubenswrapper[4816]: I0311 13:28:48.955163 4816 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b4v82" event={"ID":"7fdff21c-644f-4443-a268-f98c91ea120a","Type":"ContainerStarted","Data":"f76b45455d648ad0872263a5ce07c44e2a022af34c34e5471ca7cdfb3d0e62e9"} Mar 11 13:28:51 crc kubenswrapper[4816]: I0311 13:28:51.477875 4816 scope.go:117] "RemoveContainer" containerID="5610d0163da9f92dcf1f4addb326b68bb7bee62775e25ffcf227b46aacd6327b"